In this notebook, some template code has already been provided for you, and you will need to implement additional functionality to successfully complete this project. You will not need to modify the included code beyond what is requested. Sections that begin with '(IMPLEMENTATION)' in the header indicate that the following block of code will require additional functionality which you must provide. Instructions will be provided for each section, and the specifics of the implementation are marked in the code block with a 'TODO' statement. Please be sure to read the instructions carefully!
Note: Once you have completed all of the code implementations, you need to finalize your work by exporting the iPython Notebook as an HTML document. Before exporting the notebook to HTML, all of the code cells need to have been run so that reviewers can see the final implementation and output. You can then export the notebook by using the menu above and navigating to File -> Download as -> HTML (.html). Include the finished document along with this notebook as your submission.
In addition to implementing code, there will be questions that you must answer which relate to the project and your implementation. Each section where you will answer a question is preceded by a 'Question X' header. Carefully read each question and provide thorough answers in the following text boxes that begin with 'Answer:'. Your project submission will be evaluated based on your answers to each of the questions and the implementation you provide.
Note: Code and Markdown cells can be executed using the Shift + Enter keyboard shortcut. Markdown cells can be edited by double-clicking the cell to enter edit mode.
The rubric contains optional "Stand Out Suggestions" for enhancing the project beyond the minimum requirements. If you decide to pursue the "Stand Out Suggestions", you should include the code in this IPython notebook.
In this notebook, you will take the first steps towards developing an algorithm that could be used as part of a mobile or web app. At the end of this project, your code will accept any user-supplied image as input. If a dog is detected in the image, it will provide an estimate of the dog's breed. If a human is detected, it will provide an estimate of the dog breed that the person most resembles. The image below displays potential sample output of your finished project (... but we expect that each student's algorithm will behave differently!).

In this real-world setting, you will need to piece together a series of models to perform different tasks; for instance, the algorithm that detects humans in an image will be different from the CNN that infers dog breed. There are many points of possible failure, and no perfect algorithm exists. Your imperfect solution will nonetheless create a fun user experience!
We break the notebook into separate steps. Feel free to use the links below to navigate the notebook.
In the code cell below, we import a dataset of dog images. We populate a few variables through the use of the load_files function from the scikit-learn library:
train_files, valid_files, test_files - numpy arrays containing file paths to images
train_targets, valid_targets, test_targets - numpy arrays containing onehot-encoded classification labels
dog_names - list of string-valued dog breed names for translating labels
from sklearn.datasets import load_files
from keras.utils import np_utils
import numpy as np
from glob import glob
import pickle
# define function to load train, test, and validation datasets
def load_dataset(path):
    data = load_files(path)
    dog_files = np.array(data['filenames'])
    dog_targets = np_utils.to_categorical(np.array(data['target']), 133)
    return dog_files, dog_targets
# load train, test, and validation datasets
train_files, train_targets = load_dataset('dogImages/train')
valid_files, valid_targets = load_dataset('dogImages/valid')
test_files, test_targets = load_dataset('dogImages/test')
# load list of dog names
dog_names = [item[20:-1] for item in sorted(glob("dogImages/train/*/"))]
# print statistics about the dataset
print('There are %d total dog categories.' % len(dog_names))
print('There are %s total dog images.\n' % len(np.hstack([train_files, valid_files, test_files])))
print('There are %d training dog images.' % len(train_files))
print('There are %d validation dog images.' % len(valid_files))
print('There are %d test dog images.'% len(test_files))
C:\Users\Ran\Anaconda3\lib\site-packages\h5py\__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  from ._conv import register_converters as _register_converters
Using TensorFlow backend.
There are 133 total dog categories.
There are 8351 total dog images.

There are 6680 training dog images.
There are 835 validation dog images.
There are 836 test dog images.
In the code cell below, we import a dataset of human images, where the file paths are stored in the numpy array human_files.
import random
random.seed(8675309)
# load filenames in shuffled human dataset
human_files = np.array(glob("lfw/*/*"))
random.shuffle(human_files)
# print statistics about the dataset
print('There are %d total human images.' % len(human_files))
There are 13233 total human images.
We use OpenCV's implementation of Haar feature-based cascade classifiers to detect human faces in images. OpenCV provides many pre-trained face detectors, stored as XML files on GitHub. We have downloaded one of these detectors and stored it in the haarcascades directory.
In the next code cell, we demonstrate how to use this detector to find human faces in a sample image.
import cv2
import matplotlib.pyplot as plt
%matplotlib inline
# extract pre-trained face detector
face_cascade = cv2.CascadeClassifier('haarcascades/haarcascade_frontalface_alt.xml')
# load color (BGR) image
img = cv2.imread(human_files[3])
# convert BGR image to grayscale
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
# find faces in image
faces = face_cascade.detectMultiScale(gray)
# print number of faces detected in the image
print('Number of faces detected:', len(faces))
# get bounding box for each detected face
for (x, y, w, h) in faces:
    # add bounding box to color image
    cv2.rectangle(img, (x, y), (x + w, y + h), (255, 0, 0), 2)
# convert BGR image to RGB for plotting
cv_rgb = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
# display the image, along with bounding box
plt.imshow(cv_rgb)
plt.show()
Number of faces detected: 1
Before using any of the face detectors, it is standard procedure to convert the images to grayscale. The detectMultiScale function executes the classifier stored in face_cascade and takes the grayscale image as a parameter.
In the above code, faces is a numpy array of detected faces, where each row corresponds to a detected face. Each detected face is a 1D array with four entries that specifies the bounding box of the detected face. The first two entries in the array (extracted in the above code as x and y) specify the horizontal and vertical positions of the top left corner of the bounding box. The last two entries in the array (extracted here as w and h) specify the width and height of the box.
We can use this procedure to write a function that returns True if a human face is detected in an image and False otherwise. This function, aptly named face_detector, takes a string-valued file path to an image as input and appears in the code block below.
# returns "True" if face is detected in image stored at img_path
def face_detector(img_path):
    img = cv2.imread(img_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray)
    return len(faces) > 0
Question 1: Use the code cell below to test the performance of the face_detector function.
- What percentage of the first 100 images in human_files have a detected human face?
- What percentage of the first 100 images in dog_files have a detected human face?

Ideally, we would like 100% of human images with a detected face and 0% of dog images with a detected face. You will see that our algorithm falls short of this goal, but still gives acceptable performance. We extract the file paths for the first 100 images from each of the datasets and store them in the numpy arrays human_files_short and dog_files_short.
Answer:
I found that for human images the face detector finds faces in 99% of the photos, while for dog images it finds human faces in 12% of the photos.
human_files_short = human_files[:100]
dog_files_short = train_files[:100]
# Do NOT modify the code above this line.
## Test the performance of the face_detector algorithm
## on the images in human_files_short and dog_files_short.
def face_detect_percentage(img_array):
    counter = 0
    percentage = 0
    for img_path in img_array:
        counter += face_detector(img_path)
    percentage = counter * 100 / len(img_array)
    print("we found human faces in ", percentage, "% of the photos")
    return percentage
print("for human images: ")
face_detect_percentage(human_files_short)
print("for dog images: ")
face_detect_percentage(dog_files_short)
for human images: 
we found human faces in  99.0 % of the photos
for dog images: 
we found human faces in  12.0 % of the photos
12.0
Question 2: This algorithmic choice necessitates that we communicate to the user that we accept human images only when they provide a clear view of a face (otherwise, we risk having unnecessarily frustrated users!). In your opinion, is this a reasonable expectation to pose on the user? If not, can you think of a way to detect humans in images that does not necessitate an image with a clearly presented face?
Answer:
I think this is a legitimate expectation to place on the user, since this is our algorithm and these are its limitations; just as mathematics says not to divide by zero, this is, in my opinion, the same level of expectation. Moreover, it is a good thing that we state these limitations up front, so the user can decide for himself whether he wants to use this algorithm despite the known issue, rather than working hard to integrate our algorithm and then getting frustrated because it is not what he needs.
We suggest the face detector from OpenCV as a potential way to detect human images in your algorithm, but you are free to explore other approaches, especially approaches that make use of deep learning :). Please use the code cell below to design and test your own face detection algorithm. If you decide to pursue this optional task, report performance on each of the datasets.
## (Optional) TODO: Report the performance of another
## face detection algorithm on the LFW dataset
### Feel free to use as many code cells as needed.
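As one possibility for this optional task, the sketch below swaps OpenCV's DNN module with the pre-trained res10 SSD face model in place of the Haar cascade. It is a minimal sketch, assuming you have separately downloaded deploy.prototxt and res10_300x300_ssd_iter_140000.caffemodel from the OpenCV repository; the confidence threshold of 0.5 and the function name dnn_face_detector are illustrative choices, not part of the project template.

import cv2
import numpy as np

# load the pre-trained res10 SSD face model (files downloaded separately)
dnn_net = cv2.dnn.readNetFromCaffe('deploy.prototxt',
                                   'res10_300x300_ssd_iter_140000.caffemodel')

def dnn_face_detector(img_path, conf_threshold=0.5):
    img = cv2.imread(img_path)
    # the res10 SSD model expects a 300x300 BGR input with this mean subtraction
    blob = cv2.dnn.blobFromImage(cv2.resize(img, (300, 300)), 1.0,
                                 (300, 300), (104.0, 177.0, 123.0))
    dnn_net.setInput(blob)
    detections = dnn_net.forward()
    # detections[0, 0, i, 2] holds the confidence score of the i-th detection
    return bool(np.any(detections[0, 0, :, 2] > conf_threshold))

You could then reuse the same evaluation loop as above, substituting dnn_face_detector for face_detector, and report the percentages on human_files_short and dog_files_short.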
In this section, we use a pre-trained ResNet-50 model to detect dogs in images. Our first line of code downloads the ResNet-50 model, along with weights that have been trained on ImageNet, a very large, very popular dataset used for image classification and other vision tasks. ImageNet contains over 10 million URLs, each linking to an image containing an object from one of 1000 categories. Given an image, this pre-trained ResNet-50 model returns a prediction (derived from the available categories in ImageNet) for the object that is contained in the image.
from keras.applications.resnet50 import ResNet50
# define ResNet50 model
ResNet50_model = ResNet50(weights='imagenet')
When using TensorFlow as backend, Keras CNNs require a 4D array (which we'll also refer to as a 4D tensor) as input, with shape
$$ (\text{nb\_samples}, \text{rows}, \text{columns}, \text{channels}), $$
where nb_samples corresponds to the total number of images (or samples), and rows, columns, and channels correspond to the number of rows, columns, and channels for each image, respectively.
The path_to_tensor function below takes a string-valued file path to a color image as input and returns a 4D tensor suitable for supplying to a Keras CNN. The function first loads the image and resizes it to a square image that is $224 \times 224$ pixels. Next, the image is converted to an array, which is then resized to a 4D tensor. In this case, since we are working with color images, each image has three channels. Likewise, since we are processing a single image (or sample), the returned tensor will always have shape
$$ (1, 224, 224, 3). $$
The paths_to_tensor function takes a numpy array of string-valued image paths as input and returns a 4D tensor with shape
$$ (\text{nb\_samples}, 224, 224, 3). $$
Here, nb_samples is the number of samples, or number of images, in the supplied array of image paths. It is best to think of nb_samples as the number of 3D tensors (where each 3D tensor corresponds to a different image) in your dataset!
from keras.preprocessing import image
from tqdm import tqdm
def path_to_tensor(img_path):
    # loads RGB image as PIL.Image.Image type
    img = image.load_img(img_path, target_size=(224, 224))
    # convert PIL.Image.Image type to 3D tensor with shape (224, 224, 3)
    x = image.img_to_array(img)
    # convert 3D tensor to 4D tensor with shape (1, 224, 224, 3) and return 4D tensor
    return np.expand_dims(x, axis=0)

def paths_to_tensor(img_paths):
    list_of_tensors = [path_to_tensor(img_path) for img_path in tqdm(img_paths)]
    return np.vstack(list_of_tensors)
Getting the 4D tensor ready for ResNet-50, and for any other pre-trained model in Keras, requires some additional processing. First, the RGB image is converted to BGR by reordering the channels. All pre-trained models have the additional normalization step that the mean pixel (expressed in BGR order as $[103.939, 116.779, 123.68]$ and calculated from all pixels in all images in ImageNet) must be subtracted from every pixel in each image. This is implemented in the imported function preprocess_input. If you're curious, you can check the code for preprocess_input here.
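To make the normalization concrete, here is a minimal sketch of the equivalent manual preprocessing. This is not the Keras source, just an illustration of the channel reordering and mean subtraction described above; the function name preprocess_input_sketch is ours.

import numpy as np

def preprocess_input_sketch(x):
    # reorder channels on the last axis: RGB -> BGR
    x = x[..., ::-1]
    # subtract the ImageNet mean pixel (BGR order) from every pixel
    return x - np.array([103.939, 116.779, 123.68])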
Now that we have a way to format our image for supplying to ResNet-50, we are now ready to use the model to extract the predictions. This is accomplished with the predict method, which returns an array whose $i$-th entry is the model's predicted probability that the image belongs to the $i$-th ImageNet category. This is implemented in the ResNet50_predict_labels function below.
By taking the argmax of the predicted probability vector, we obtain an integer corresponding to the model's predicted object class, which we can identify with an object category through the use of this dictionary.
from keras.applications.resnet50 import preprocess_input, decode_predictions
def ResNet50_predict_labels(img_path):
    # returns prediction vector for image located at img_path
    img = preprocess_input(path_to_tensor(img_path))
    return np.argmax(ResNet50_model.predict(img))
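As a quick illustration, the imported decode_predictions utility performs this dictionary lookup for you. The snippet below is a hedged usage sketch, with 'sample.jpg' standing in for any hypothetical image path.

# map the raw prediction vector to human-readable ImageNet labels
preds = ResNet50_model.predict(preprocess_input(path_to_tensor('sample.jpg')))
# top=3 returns the three most probable (class_id, class_name, probability) tuples
print(decode_predictions(preds, top=3)[0])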
While looking at the dictionary, you will notice that the categories corresponding to dogs appear in an uninterrupted sequence and correspond to dictionary keys 151-268, inclusive, to include all categories from 'Chihuahua' to 'Mexican hairless'. Thus, in order to check to see if an image is predicted to contain a dog by the pre-trained ResNet-50 model, we need only check if the ResNet50_predict_labels function above returns a value between 151 and 268 (inclusive).
We use these ideas to complete the dog_detector function below, which returns True if a dog is detected in an image (and False if not).
### returns "True" if a dog is detected in the image stored at img_path
def dog_detector(img_path):
    prediction = ResNet50_predict_labels(img_path)
    return (prediction <= 268) & (prediction >= 151)
Question 3: Use the code cell below to test the performance of your dog_detector function.
- What percentage of the images in human_files_short have a detected dog?
- What percentage of the images in dog_files_short have a detected dog?

Answer:
I see that for human images the dog_detector finds a dog in 1% of the images, and for dog images it finds dogs in 100% of the images.
### Test the performance of the dog_detector function
### on the images in human_files_short and dog_files_short.
def dog_detect_percentage(img_array):
    counter = 0
    percentage = 0
    for img_path in img_array:
        counter += dog_detector(img_path)
    percentage = counter * 100 / len(img_array)
    print("we found dogs in ", percentage, "% of the photos")
    return percentage
print("for human images: ")
dog_detect_percentage(human_files_short)
print("for dog images: ")
dog_detect_percentage(dog_files_short)
for human images: 
we found dogs in  1.0 % of the photos
for dog images: 
we found dogs in  100.0 % of the photos
100.0
Now that we have functions for detecting humans and dogs in images, we need a way to predict breed from images. In this step, you will create a CNN that classifies dog breeds. You must create your CNN from scratch (so, you can't use transfer learning yet!), and you must attain a test accuracy of at least 1%. In Step 5 of this notebook, you will have the opportunity to use transfer learning to create a CNN that attains greatly improved accuracy.
Be careful with adding too many trainable layers! More parameters mean longer training, which means you are more likely to need a GPU to accelerate the training process. Thankfully, Keras provides a handy estimate of the time that each epoch is likely to take; you can extrapolate this estimate to figure out how long your algorithm will take to train (for instance, at roughly 145 seconds per epoch, 5 epochs take about 12 minutes).
We mention that the task of assigning breed to dogs from images is considered exceptionally challenging. To see why, consider that even a human would have great difficulty in distinguishing between a Brittany and a Welsh Springer Spaniel.
| Brittany | Welsh Springer Spaniel |
|---|---|
| ![]() | ![]() |
It is not difficult to find other dog breed pairs with minimal inter-class variation (for instance, Curly-Coated Retrievers and American Water Spaniels).
| Curly-Coated Retriever | American Water Spaniel |
|---|---|
| ![]() | ![]() |
Likewise, recall that labradors come in yellow, chocolate, and black. Your vision-based algorithm will have to conquer this high intra-class variation to determine how to classify all of these different shades as the same breed.
| Yellow Labrador | Chocolate Labrador | Black Labrador |
|---|---|---|
| ![]() | ![]() | ![]() |
We also mention that random chance presents an exceptionally low bar: setting aside the fact that the classes are slightly imbalanced, a random guess will provide a correct answer roughly 1 in 133 times ($1/133 \approx 0.75\%$), which corresponds to an accuracy of less than 1%.
Remember that the practice is far ahead of the theory in deep learning. Experiment with many different architectures, and trust your intuition. And, of course, have fun!
We rescale the images by dividing every pixel in every image by 255.
from PIL import ImageFile
ImageFile.LOAD_TRUNCATED_IMAGES = True
# pre-process the data for Keras
train_tensors = paths_to_tensor(train_files).astype('float32')/255
valid_tensors = paths_to_tensor(valid_files).astype('float32')/255
test_tensors = paths_to_tensor(test_files).astype('float32')/255
100%|█████████████████████████████████████████████████████████████████████████████| 6680/6680 [00:48<00:00, 137.64it/s]
100%|███████████████████████████████████████████████████████████████████████████████| 835/835 [00:05<00:00, 150.52it/s]
100%|███████████████████████████████████████████████████████████████████████████████| 836/836 [00:05<00:00, 152.60it/s]
Create a CNN to classify dog breed. At the end of your code cell block, summarize the layers of your model by executing the line:
model.summary()
We have imported some Python modules to get you started, but feel free to import as many modules as you need. If you end up getting stuck, here's a hint that specifies a model that trains relatively fast on CPU and attains >1% test accuracy in 5 epochs:

Question 4: Outline the steps you took to get to your final CNN architecture and your reasoning at each step. If you chose to use the hinted architecture above, describe why you think that CNN architecture should work well for the image classification task.
Answer: I chose the hinted architecture. I considered stacking two Conv2D layers before each max pooling step, but that makes the network very big and requires many more epochs and more training data, so I picked the hinted architecture with its three pairs of Conv2D and MaxPooling2D layers. I noticed that the network gets better results when I use GlobalAveragePooling2D after the convolutional layers, and since its output is a small shape, there is no point in putting another dense layer on top of that.
from keras.layers import Conv2D, MaxPooling2D, AveragePooling2D, GlobalMaxPooling2D, GlobalAveragePooling2D
from keras.layers import Dropout, Flatten, Dense
from keras.models import Sequential
model = Sequential()
model.add(Conv2D(filters=16, kernel_size=2, padding='same', activation='relu',
                 input_shape=(224, 224, 3)))
model.add(MaxPooling2D(pool_size=2))
model.add(Conv2D(filters=32, kernel_size=2, padding='same', activation='relu'))
model.add(MaxPooling2D(pool_size=2))
model.add(Conv2D(filters=64, kernel_size=2, padding='same', activation='relu'))
model.add(MaxPooling2D(pool_size=2))
model.add(GlobalAveragePooling2D())
model.add(Dense(133, activation='softmax'))
model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_1 (Conv2D)            (None, 224, 224, 16)      208       
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 112, 112, 16)      0         
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 112, 112, 32)      2080      
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 56, 56, 32)        0         
_________________________________________________________________
conv2d_3 (Conv2D)            (None, 56, 56, 64)        8256      
_________________________________________________________________
max_pooling2d_4 (MaxPooling2 (None, 28, 28, 64)        0         
_________________________________________________________________
global_average_pooling2d_1 ( (None, 64)                0         
_________________________________________________________________
dense_1 (Dense)              (None, 133)               8645      
=================================================================
Total params: 19,189
Trainable params: 19,189
Non-trainable params: 0
_________________________________________________________________
model.compile(optimizer='rmsprop', loss='categorical_crossentropy', metrics=['accuracy'])
Train your model in the code cell below. Use model checkpointing to save the model that attains the best validation loss.
You are welcome to augment the training data, but this is not a requirement.
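If you do choose to augment, here is a minimal sketch using Keras' ImageDataGenerator; the transformation parameters below are illustrative rather than tuned, and the commented fit_generator call assumes the checkpointer defined in the next cell.

from keras.preprocessing.image import ImageDataGenerator

# illustrative augmentation settings (not tuned)
datagen = ImageDataGenerator(
    rotation_range=10,       # random rotations of up to 10 degrees
    width_shift_range=0.1,   # random horizontal shifts
    height_shift_range=0.1,  # random vertical shifts
    horizontal_flip=True)    # random left-right flips

# then train on augmented batches with, e.g.:
# model.fit_generator(datagen.flow(train_tensors, train_targets, batch_size=20),
#                     steps_per_epoch=len(train_tensors) // 20, epochs=epochs,
#                     validation_data=(valid_tensors, valid_targets),
#                     callbacks=[checkpointer], verbose=1)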
from keras.callbacks import ModelCheckpoint
### TODO: specify the number of epochs that you would like to use to train the model.
epochs = 5
### Do NOT modify the code below this line.
checkpointer = ModelCheckpoint(filepath='saved_models/weights.best.from_scratch1.hdf5',
                               verbose=1, save_best_only=True)
model.fit(train_tensors, train_targets,
          validation_data=(valid_tensors, valid_targets),
          epochs=epochs, batch_size=20, callbacks=[checkpointer], verbose=1)
Train on 6680 samples, validate on 835 samples
Epoch 1/5
6680/6680 [==============================] - 145s 22ms/step - loss: 4.8825 - acc: 0.0106 - val_loss: 4.8673 - val_acc: 0.0108
Epoch 00001: val_loss improved from inf to 4.86729, saving model to saved_models/weights.best.from_scratch1.hdf5
Epoch 2/5
6680/6680 [==============================] - 144s 22ms/step - loss: 4.8586 - acc: 0.0136 - val_loss: 4.8395 - val_acc: 0.0168
Epoch 00002: val_loss improved from 4.86729 to 4.83948, saving model to saved_models/weights.best.from_scratch1.hdf5
Epoch 3/5
6680/6680 [==============================] - 144s 22ms/step - loss: 4.8160 - acc: 0.0190 - val_loss: 4.8198 - val_acc: 0.0204
Epoch 00003: val_loss improved from 4.83948 to 4.81981, saving model to saved_models/weights.best.from_scratch1.hdf5
Epoch 4/5
0.020 - ETA: 1:40 - loss: 4.7749 - acc: 0.020 - ETA: 1:39 - loss: 4.7738 - acc: 0.019 - ETA: 1:39 - loss: 4.7741 - acc: 0.019 - ETA: 1:39 - loss: 4.7760 - acc: 0.019 - ETA: 1:38 - loss: 4.7784 - acc: 0.019 - ETA: 1:38 - loss: 4.7782 - acc: 0.018 - ETA: 1:37 - loss: 4.7778 - acc: 0.019 - ETA: 1:37 - loss: 4.7760 - acc: 0.019 - ETA: 1:37 - loss: 4.7760 - acc: 0.019 - ETA: 1:36 - loss: 4.7762 - acc: 0.019 - ETA: 1:36 - loss: 4.7766 - acc: 0.019 - ETA: 1:35 - loss: 4.7778 - acc: 0.018 - ETA: 1:35 - loss: 4.7787 - acc: 0.018 - ETA: 1:35 - loss: 4.7787 - acc: 0.018 - ETA: 1:34 - loss: 4.7769 - acc: 0.018 - ETA: 1:34 - loss: 4.7775 - acc: 0.018 - ETA: 1:34 - loss: 4.7763 - acc: 0.018 - ETA: 1:33 - loss: 4.7776 - acc: 0.018 - ETA: 1:33 - loss: 4.7786 - acc: 0.018 - ETA: 1:33 - loss: 4.7788 - acc: 0.018 - ETA: 1:32 - loss: 4.7787 - acc: 0.019 - ETA: 1:32 - loss: 4.7798 - acc: 0.019 - ETA: 1:31 - loss: 4.7796 - acc: 0.019 - ETA: 1:31 - loss: 4.7802 - acc: 0.019 - ETA: 1:31 - loss: 4.7802 - acc: 0.018 - ETA: 1:30 - loss: 4.7790 - acc: 0.019 - ETA: 1:30 - loss: 4.7795 - acc: 0.019 - ETA: 1:29 - loss: 4.7810 - acc: 0.019 - ETA: 1:29 - loss: 4.7799 - acc: 0.019 - ETA: 1:29 - loss: 4.7798 - acc: 0.019 - ETA: 1:28 - loss: 4.7799 - acc: 0.020 - ETA: 1:28 - loss: 4.7809 - acc: 0.019 - ETA: 1:27 - loss: 4.7800 - acc: 0.020 - ETA: 1:27 - loss: 4.7802 - acc: 0.020 - ETA: 1:26 - loss: 4.7808 - acc: 0.021 - ETA: 1:26 - loss: 4.7815 - acc: 0.021 - ETA: 1:26 - loss: 4.7813 - acc: 0.022 - ETA: 1:25 - loss: 4.7804 - acc: 0.022 - ETA: 1:25 - loss: 4.7798 - acc: 0.021 - ETA: 1:24 - loss: 4.7796 - acc: 0.022 - ETA: 1:24 - loss: 4.7782 - acc: 0.022 - ETA: 1:24 - loss: 4.7780 - acc: 0.022 - ETA: 1:23 - loss: 4.7776 - acc: 0.022 - ETA: 1:23 - loss: 4.7779 - acc: 0.021 - ETA: 1:22 - loss: 4.7778 - acc: 0.021 - ETA: 1:22 - loss: 4.7803 - acc: 0.021 - ETA: 1:21 - loss: 4.7803 - acc: 0.021 - ETA: 1:21 - loss: 4.7807 - acc: 0.021 - ETA: 1:21 - loss: 4.7808 - acc: 0.021 - ETA: 1:20 - loss: 4.7798 - acc: 0.020 - ETA: 1:20 - loss: 4.7805 - acc: 0.020 - ETA: 1:19 - loss: 4.7818 - acc: 0.020 - ETA: 1:19 - loss: 4.7822 - acc: 0.020 - ETA: 1:19 - loss: 4.7829 - acc: 0.020 - ETA: 1:18 - loss: 4.7833 - acc: 0.020 - ETA: 1:18 - loss: 4.7827 - acc: 0.020 - ETA: 1:17 - loss: 4.7824 - acc: 0.020 - ETA: 1:17 - loss: 4.7819 - acc: 0.020 - ETA: 1:16 - loss: 4.7820 - acc: 0.021 - ETA: 1:16 - loss: 4.7824 - acc: 0.021 - ETA: 1:16 - loss: 4.7813 - acc: 0.021 - ETA: 1:15 - loss: 4.7805 - acc: 0.020 - ETA: 1:15 - loss: 4.7804 - acc: 0.021 - ETA: 1:14 - loss: 4.7803 - acc: 0.020 - ETA: 1:14 - loss: 4.7808 - acc: 0.020 - ETA: 1:13 - loss: 4.7800 - acc: 0.021 - ETA: 1:13 - loss: 4.7790 - acc: 0.020 - ETA: 1:13 - loss: 4.7785 - acc: 0.020 - ETA: 1:12 - loss: 4.7780 - acc: 0.020 - ETA: 1:12 - loss: 4.7775 - acc: 0.020 - ETA: 1:11 - loss: 4.7786 - acc: 0.020 - ETA: 1:11 - loss: 4.7774 - acc: 0.020 - ETA: 1:11 - loss: 4.7779 - acc: 0.020 - ETA: 1:10 - loss: 4.7788 - acc: 0.020 - ETA: 1:10 - loss: 4.7781 - acc: 0.020 - ETA: 1:09 - loss: 4.7784 - acc: 0.020 - ETA: 1:09 - loss: 4.7791 - acc: 0.019 - ETA: 1:08 - loss: 4.7793 - acc: 0.019 - ETA: 1:08 - loss: 4.7792 - acc: 0.019 - ETA: 1:08 - loss: 4.7789 - acc: 0.020 - ETA: 1:07 - loss: 4.7784 - acc: 0.020 - ETA: 1:07 - loss: 4.7781 - acc: 0.020 - ETA: 1:06 - loss: 4.7776 - acc: 0.020 - ETA: 1:06 - loss: 4.7780 - acc: 0.019 - ETA: 1:06 - loss: 4.7777 - acc: 0.019 - ETA: 1:05 - loss: 4.7769 - acc: 0.020 - ETA: 1:05 - loss: 4.7775 - acc: 0.020 - ETA: 1:04 - loss: 4.7772 - acc: 0.020 - ETA: 1:04 - loss: 4.7771 - 
acc: 0.020 - ETA: 1:04 - loss: 4.7774 - acc: 0.020 - ETA: 1:03 - loss: 4.7782 - acc: 0.020 - ETA: 1:03 - loss: 4.7789 - acc: 0.020 - ETA: 1:02 - loss: 4.7789 - acc: 0.020 - ETA: 1:02 - loss: 4.7793 - acc: 0.020 - ETA: 1:01 - loss: 4.7786 - acc: 0.020 - ETA: 1:01 - loss: 4.7782 - acc: 0.020 - ETA: 1:01 - loss: 4.7771 - acc: 0.020 - ETA: 1:00 - loss: 4.7777 - acc: 0.020 - ETA: 1:00 - loss: 4.7764 - acc: 0.020 - ETA: 59s - loss: 4.7767 - acc: 0.020 - ETA: 59s - loss: 4.7762 - acc: 0.02 - ETA: 59s - loss: 4.7780 - acc: 0.01 - ETA: 58s - loss: 4.7779 - acc: 0.02 - ETA: 58s - loss: 4.7777 - acc: 0.02 - ETA: 57s - loss: 4.7778 - acc: 0.02 - ETA: 57s - loss: 4.7776 - acc: 0.02 - ETA: 56s - loss: 4.7772 - acc: 0.01 - ETA: 56s - loss: 4.7773 - acc: 0.01 - ETA: 56s - loss: 4.7781 - acc: 0.01 - ETA: 55s - loss: 4.7777 - acc: 0.01 - ETA: 55s - loss: 4.7776 - acc: 0.01 - ETA: 54s - loss: 4.7784 - acc: 0.01 - ETA: 54s - loss: 4.7793 - acc: 0.01 - ETA: 54s - loss: 4.7796 - acc: 0.01 - ETA: 53s - loss: 4.7793 - acc: 0.01 - ETA: 53s - loss: 4.7799 - acc: 0.01956680/6680 [==============================] - ETA: 52s - loss: 4.7796 - acc: 0.01 - ETA: 52s - loss: 4.7797 - acc: 0.01 - ETA: 51s - loss: 4.7795 - acc: 0.01 - ETA: 51s - loss: 4.7791 - acc: 0.01 - ETA: 51s - loss: 4.7785 - acc: 0.01 - ETA: 50s - loss: 4.7790 - acc: 0.02 - ETA: 50s - loss: 4.7787 - acc: 0.02 - ETA: 49s - loss: 4.7790 - acc: 0.02 - ETA: 49s - loss: 4.7792 - acc: 0.01 - ETA: 49s - loss: 4.7792 - acc: 0.01 - ETA: 48s - loss: 4.7794 - acc: 0.01 - ETA: 48s - loss: 4.7792 - acc: 0.01 - ETA: 47s - loss: 4.7792 - acc: 0.01 - ETA: 47s - loss: 4.7793 - acc: 0.01 - ETA: 47s - loss: 4.7789 - acc: 0.01 - ETA: 46s - loss: 4.7792 - acc: 0.01 - ETA: 46s - loss: 4.7799 - acc: 0.01 - ETA: 45s - loss: 4.7802 - acc: 0.01 - ETA: 45s - loss: 4.7792 - acc: 0.01 - ETA: 44s - loss: 4.7803 - acc: 0.01 - ETA: 44s - loss: 4.7799 - acc: 0.01 - ETA: 44s - loss: 4.7798 - acc: 0.01 - ETA: 43s - loss: 4.7802 - acc: 0.01 - ETA: 43s - loss: 4.7806 - acc: 0.01 - ETA: 42s - loss: 4.7802 - acc: 0.01 - ETA: 42s - loss: 4.7797 - acc: 0.02 - ETA: 42s - loss: 4.7801 - acc: 0.02 - ETA: 41s - loss: 4.7804 - acc: 0.02 - ETA: 41s - loss: 4.7801 - acc: 0.01 - ETA: 40s - loss: 4.7805 - acc: 0.01 - ETA: 40s - loss: 4.7804 - acc: 0.01 - ETA: 39s - loss: 4.7803 - acc: 0.01 - ETA: 39s - loss: 4.7811 - acc: 0.01 - ETA: 39s - loss: 4.7811 - acc: 0.01 - ETA: 38s - loss: 4.7809 - acc: 0.01 - ETA: 38s - loss: 4.7805 - acc: 0.01 - ETA: 37s - loss: 4.7804 - acc: 0.01 - ETA: 37s - loss: 4.7795 - acc: 0.01 - ETA: 37s - loss: 4.7797 - acc: 0.01 - ETA: 36s - loss: 4.7801 - acc: 0.01 - ETA: 36s - loss: 4.7804 - acc: 0.01 - ETA: 35s - loss: 4.7809 - acc: 0.01 - ETA: 35s - loss: 4.7815 - acc: 0.01 - ETA: 35s - loss: 4.7815 - acc: 0.01 - ETA: 34s - loss: 4.7814 - acc: 0.01 - ETA: 34s - loss: 4.7811 - acc: 0.01 - ETA: 33s - loss: 4.7812 - acc: 0.01 - ETA: 33s - loss: 4.7816 - acc: 0.01 - ETA: 32s - loss: 4.7821 - acc: 0.01 - ETA: 32s - loss: 4.7818 - acc: 0.01 - ETA: 32s - loss: 4.7815 - acc: 0.01 - ETA: 31s - loss: 4.7809 - acc: 0.01 - ETA: 31s - loss: 4.7807 - acc: 0.01 - ETA: 30s - loss: 4.7804 - acc: 0.01 - ETA: 30s - loss: 4.7798 - acc: 0.01 - ETA: 30s - loss: 4.7798 - acc: 0.01 - ETA: 29s - loss: 4.7792 - acc: 0.01 - ETA: 29s - loss: 4.7788 - acc: 0.01 - ETA: 28s - loss: 4.7792 - acc: 0.01 - ETA: 28s - loss: 4.7794 - acc: 0.01 - ETA: 27s - loss: 4.7791 - acc: 0.01 - ETA: 27s - loss: 4.7797 - acc: 0.01 - ETA: 27s - loss: 4.7798 - acc: 0.01 - ETA: 26s - loss: 4.7794 - acc: 0.01 - ETA: 26s - loss: 
4.7796 - acc: 0.01 - ETA: 25s - loss: 4.7798 - acc: 0.01 - ETA: 25s - loss: 4.7797 - acc: 0.01 - ETA: 25s - loss: 4.7798 - acc: 0.01 - ETA: 24s - loss: 4.7798 - acc: 0.01 - ETA: 24s - loss: 4.7792 - acc: 0.01 - ETA: 23s - loss: 4.7794 - acc: 0.01 - ETA: 23s - loss: 4.7795 - acc: 0.01 - ETA: 23s - loss: 4.7790 - acc: 0.01 - ETA: 22s - loss: 4.7791 - acc: 0.01 - ETA: 22s - loss: 4.7789 - acc: 0.01 - ETA: 21s - loss: 4.7788 - acc: 0.01 - ETA: 21s - loss: 4.7781 - acc: 0.01 - ETA: 20s - loss: 4.7790 - acc: 0.01 - ETA: 20s - loss: 4.7786 - acc: 0.01 - ETA: 20s - loss: 4.7785 - acc: 0.01 - ETA: 19s - loss: 4.7787 - acc: 0.01 - ETA: 19s - loss: 4.7785 - acc: 0.01 - ETA: 18s - loss: 4.7792 - acc: 0.01 - ETA: 18s - loss: 4.7793 - acc: 0.01 - ETA: 18s - loss: 4.7788 - acc: 0.01 - ETA: 17s - loss: 4.7789 - acc: 0.01 - ETA: 17s - loss: 4.7787 - acc: 0.01 - ETA: 16s - loss: 4.7784 - acc: 0.01 - ETA: 16s - loss: 4.7788 - acc: 0.01 - ETA: 16s - loss: 4.7789 - acc: 0.01 - ETA: 15s - loss: 4.7786 - acc: 0.01 - ETA: 15s - loss: 4.7788 - acc: 0.01 - ETA: 14s - loss: 4.7782 - acc: 0.01 - ETA: 14s - loss: 4.7780 - acc: 0.01 - ETA: 13s - loss: 4.7780 - acc: 0.01 - ETA: 13s - loss: 4.7777 - acc: 0.01 - ETA: 13s - loss: 4.7778 - acc: 0.01 - ETA: 12s - loss: 4.7777 - acc: 0.01 - ETA: 12s - loss: 4.7777 - acc: 0.01 - ETA: 11s - loss: 4.7783 - acc: 0.01 - ETA: 11s - loss: 4.7777 - acc: 0.01 - ETA: 11s - loss: 4.7776 - acc: 0.01 - ETA: 10s - loss: 4.7782 - acc: 0.01 - ETA: 10s - loss: 4.7781 - acc: 0.01 - ETA: 9s - loss: 4.7782 - acc: 0.0181 - ETA: 9s - loss: 4.7786 - acc: 0.018 - ETA: 9s - loss: 4.7787 - acc: 0.017 - ETA: 8s - loss: 4.7786 - acc: 0.017 - ETA: 8s - loss: 4.7785 - acc: 0.017 - ETA: 7s - loss: 4.7783 - acc: 0.017 - ETA: 7s - loss: 4.7782 - acc: 0.017 - ETA: 6s - loss: 4.7782 - acc: 0.017 - ETA: 6s - loss: 4.7784 - acc: 0.017 - ETA: 6s - loss: 4.7777 - acc: 0.018 - ETA: 5s - loss: 4.7781 - acc: 0.018 - ETA: 5s - loss: 4.7780 - acc: 0.018 - ETA: 4s - loss: 4.7783 - acc: 0.018 - ETA: 4s - loss: 4.7781 - acc: 0.018 - ETA: 4s - loss: 4.7782 - acc: 0.018 - ETA: 3s - loss: 4.7783 - acc: 0.018 - ETA: 3s - loss: 4.7784 - acc: 0.018 - ETA: 2s - loss: 4.7781 - acc: 0.018 - ETA: 2s - loss: 4.7778 - acc: 0.018 - ETA: 2s - loss: 4.7777 - acc: 0.018 - ETA: 1s - loss: 4.7776 - acc: 0.018 - ETA: 1s - loss: 4.7778 - acc: 0.018 - ETA: 0s - loss: 4.7778 - acc: 0.018 - ETA: 0s - loss: 4.7782 - acc: 0.018 - 144s 21ms/step - loss: 4.7779 - acc: 0.0184 - val_loss: 4.7848 - val_acc: 0.0240 Epoch 00004: val_loss improved from 4.81981 to 4.78476, saving model to saved_models/weights.best.from_scratch1.hdf5 Epoch 5/5 4080/6680 [=================>............] 
- ETA: 2:13 - loss: 4.7086 - acc: 0.0000e+0 - ETA: 2:13 - loss: 4.7917 - acc: 0.0000e+0 - ETA: 2:14 - loss: 4.7923 - acc: 0.0000e+0 - ETA: 2:14 - loss: 4.7544 - acc: 0.0000e+0 - ETA: 2:14 - loss: 4.7586 - acc: 0.0000e+0 - ETA: 2:14 - loss: 4.7527 - acc: 0.0000e+0 - ETA: 2:14 - loss: 4.7192 - acc: 0.0000e+0 - ETA: 2:13 - loss: 4.7460 - acc: 0.0000e+0 - ETA: 2:12 - loss: 4.7432 - acc: 0.0000e+0 - ETA: 2:11 - loss: 4.7373 - acc: 0.0050 - ETA: 2:11 - loss: 4.7405 - acc: 0.009 - ETA: 2:11 - loss: 4.7456 - acc: 0.012 - ETA: 2:11 - loss: 4.7491 - acc: 0.019 - ETA: 2:11 - loss: 4.7424 - acc: 0.021 - ETA: 2:10 - loss: 4.7277 - acc: 0.020 - ETA: 2:10 - loss: 4.7245 - acc: 0.025 - ETA: 2:10 - loss: 4.7130 - acc: 0.026 - ETA: 2:10 - loss: 4.7164 - acc: 0.027 - ETA: 2:09 - loss: 4.7149 - acc: 0.026 - ETA: 2:09 - loss: 4.7152 - acc: 0.025 - ETA: 2:08 - loss: 4.7239 - acc: 0.023 - ETA: 2:08 - loss: 4.7214 - acc: 0.022 - ETA: 2:07 - loss: 4.7161 - acc: 0.021 - ETA: 2:07 - loss: 4.7143 - acc: 0.020 - ETA: 2:06 - loss: 4.7103 - acc: 0.020 - ETA: 2:06 - loss: 4.7173 - acc: 0.019 - ETA: 2:05 - loss: 4.7276 - acc: 0.020 - ETA: 2:05 - loss: 4.7265 - acc: 0.019 - ETA: 2:05 - loss: 4.7280 - acc: 0.020 - ETA: 2:04 - loss: 4.7236 - acc: 0.021 - ETA: 2:04 - loss: 4.7227 - acc: 0.021 - ETA: 2:03 - loss: 4.7274 - acc: 0.020 - ETA: 2:03 - loss: 4.7248 - acc: 0.019 - ETA: 2:02 - loss: 4.7250 - acc: 0.020 - ETA: 2:02 - loss: 4.7271 - acc: 0.020 - ETA: 2:02 - loss: 4.7271 - acc: 0.019 - ETA: 2:01 - loss: 4.7327 - acc: 0.018 - ETA: 2:01 - loss: 4.7348 - acc: 0.018 - ETA: 2:01 - loss: 4.7373 - acc: 0.019 - ETA: 2:00 - loss: 4.7374 - acc: 0.018 - ETA: 2:00 - loss: 4.7380 - acc: 0.019 - ETA: 1:59 - loss: 4.7391 - acc: 0.020 - ETA: 1:59 - loss: 4.7382 - acc: 0.020 - ETA: 1:59 - loss: 4.7373 - acc: 0.021 - ETA: 1:58 - loss: 4.7332 - acc: 0.022 - ETA: 1:58 - loss: 4.7385 - acc: 0.021 - ETA: 1:57 - loss: 4.7395 - acc: 0.021 - ETA: 1:57 - loss: 4.7432 - acc: 0.020 - ETA: 1:56 - loss: 4.7404 - acc: 0.021 - ETA: 1:56 - loss: 4.7421 - acc: 0.022 - ETA: 1:56 - loss: 4.7424 - acc: 0.021 - ETA: 1:55 - loss: 4.7393 - acc: 0.021 - ETA: 1:55 - loss: 4.7410 - acc: 0.020 - ETA: 1:54 - loss: 4.7426 - acc: 0.020 - ETA: 1:54 - loss: 4.7448 - acc: 0.020 - ETA: 1:53 - loss: 4.7471 - acc: 0.020 - ETA: 1:53 - loss: 4.7478 - acc: 0.020 - ETA: 1:53 - loss: 4.7456 - acc: 0.019 - ETA: 1:52 - loss: 4.7471 - acc: 0.020 - ETA: 1:52 - loss: 4.7490 - acc: 0.020 - ETA: 1:51 - loss: 4.7501 - acc: 0.020 - ETA: 1:51 - loss: 4.7525 - acc: 0.021 - ETA: 1:51 - loss: 4.7529 - acc: 0.020 - ETA: 1:50 - loss: 4.7506 - acc: 0.020 - ETA: 1:50 - loss: 4.7496 - acc: 0.021 - ETA: 1:49 - loss: 4.7494 - acc: 0.022 - ETA: 1:49 - loss: 4.7503 - acc: 0.021 - ETA: 1:49 - loss: 4.7509 - acc: 0.021 - ETA: 1:48 - loss: 4.7505 - acc: 0.021 - ETA: 1:48 - loss: 4.7489 - acc: 0.022 - ETA: 1:47 - loss: 4.7498 - acc: 0.021 - ETA: 1:47 - loss: 4.7485 - acc: 0.022 - ETA: 1:47 - loss: 4.7480 - acc: 0.022 - ETA: 1:46 - loss: 4.7478 - acc: 0.022 - ETA: 1:46 - loss: 4.7477 - acc: 0.022 - ETA: 1:45 - loss: 4.7462 - acc: 0.022 - ETA: 1:45 - loss: 4.7451 - acc: 0.022 - ETA: 1:45 - loss: 4.7434 - acc: 0.021 - ETA: 1:44 - loss: 4.7447 - acc: 0.021 - ETA: 1:44 - loss: 4.7467 - acc: 0.021 - ETA: 1:44 - loss: 4.7470 - acc: 0.021 - ETA: 1:43 - loss: 4.7472 - acc: 0.020 - ETA: 1:43 - loss: 4.7465 - acc: 0.021 - ETA: 1:43 - loss: 4.7499 - acc: 0.021 - ETA: 1:42 - loss: 4.7495 - acc: 0.021 - ETA: 1:42 - loss: 4.7492 - acc: 0.021 - ETA: 1:42 - loss: 4.7512 - acc: 0.021 - ETA: 1:41 - loss: 4.7518 - acc: 
0.021 - ETA: 1:41 - loss: 4.7515 - acc: 0.021 - ETA: 1:40 - loss: 4.7498 - acc: 0.021 - ETA: 1:40 - loss: 4.7484 - acc: 0.021 - ETA: 1:39 - loss: 4.7447 - acc: 0.022 - ETA: 1:39 - loss: 4.7452 - acc: 0.022 - ETA: 1:39 - loss: 4.7444 - acc: 0.021 - ETA: 1:38 - loss: 4.7447 - acc: 0.021 - ETA: 1:38 - loss: 4.7460 - acc: 0.021 - ETA: 1:37 - loss: 4.7469 - acc: 0.021 - ETA: 1:37 - loss: 4.7451 - acc: 0.021 - ETA: 1:36 - loss: 4.7447 - acc: 0.021 - ETA: 1:36 - loss: 4.7465 - acc: 0.021 - ETA: 1:35 - loss: 4.7448 - acc: 0.020 - ETA: 1:35 - loss: 4.7454 - acc: 0.021 - ETA: 1:34 - loss: 4.7443 - acc: 0.021 - ETA: 1:34 - loss: 4.7457 - acc: 0.021 - ETA: 1:34 - loss: 4.7457 - acc: 0.021 - ETA: 1:33 - loss: 4.7462 - acc: 0.021 - ETA: 1:33 - loss: 4.7480 - acc: 0.021 - ETA: 1:32 - loss: 4.7492 - acc: 0.020 - ETA: 1:32 - loss: 4.7497 - acc: 0.021 - ETA: 1:31 - loss: 4.7502 - acc: 0.021 - ETA: 1:31 - loss: 4.7519 - acc: 0.021 - ETA: 1:30 - loss: 4.7520 - acc: 0.021 - ETA: 1:30 - loss: 4.7525 - acc: 0.021 - ETA: 1:30 - loss: 4.7519 - acc: 0.021 - ETA: 1:29 - loss: 4.7507 - acc: 0.020 - ETA: 1:29 - loss: 4.7518 - acc: 0.020 - ETA: 1:28 - loss: 4.7528 - acc: 0.020 - ETA: 1:28 - loss: 4.7535 - acc: 0.020 - ETA: 1:27 - loss: 4.7526 - acc: 0.020 - ETA: 1:27 - loss: 4.7526 - acc: 0.020 - ETA: 1:27 - loss: 4.7528 - acc: 0.020 - ETA: 1:26 - loss: 4.7537 - acc: 0.020 - ETA: 1:26 - loss: 4.7543 - acc: 0.020 - ETA: 1:25 - loss: 4.7543 - acc: 0.021 - ETA: 1:25 - loss: 4.7537 - acc: 0.020 - ETA: 1:24 - loss: 4.7536 - acc: 0.020 - ETA: 1:24 - loss: 4.7542 - acc: 0.020 - ETA: 1:24 - loss: 4.7541 - acc: 0.020 - ETA: 1:23 - loss: 4.7538 - acc: 0.020 - ETA: 1:23 - loss: 4.7542 - acc: 0.020 - ETA: 1:22 - loss: 4.7539 - acc: 0.020 - ETA: 1:22 - loss: 4.7520 - acc: 0.020 - ETA: 1:21 - loss: 4.7513 - acc: 0.020 - ETA: 1:21 - loss: 4.7521 - acc: 0.020 - ETA: 1:21 - loss: 4.7524 - acc: 0.020 - ETA: 1:20 - loss: 4.7527 - acc: 0.019 - ETA: 1:20 - loss: 4.7547 - acc: 0.019 - ETA: 1:19 - loss: 4.7540 - acc: 0.019 - ETA: 1:19 - loss: 4.7544 - acc: 0.019 - ETA: 1:18 - loss: 4.7540 - acc: 0.020 - ETA: 1:18 - loss: 4.7535 - acc: 0.020 - ETA: 1:18 - loss: 4.7537 - acc: 0.020 - ETA: 1:17 - loss: 4.7543 - acc: 0.019 - ETA: 1:17 - loss: 4.7543 - acc: 0.019 - ETA: 1:16 - loss: 4.7545 - acc: 0.019 - ETA: 1:16 - loss: 4.7540 - acc: 0.019 - ETA: 1:15 - loss: 4.7546 - acc: 0.019 - ETA: 1:15 - loss: 4.7555 - acc: 0.019 - ETA: 1:15 - loss: 4.7557 - acc: 0.019 - ETA: 1:14 - loss: 4.7560 - acc: 0.019 - ETA: 1:14 - loss: 4.7546 - acc: 0.019 - ETA: 1:13 - loss: 4.7545 - acc: 0.019 - ETA: 1:13 - loss: 4.7538 - acc: 0.019 - ETA: 1:13 - loss: 4.7544 - acc: 0.019 - ETA: 1:12 - loss: 4.7540 - acc: 0.019 - ETA: 1:12 - loss: 4.7544 - acc: 0.019 - ETA: 1:11 - loss: 4.7541 - acc: 0.019 - ETA: 1:11 - loss: 4.7549 - acc: 0.019 - ETA: 1:11 - loss: 4.7553 - acc: 0.019 - ETA: 1:10 - loss: 4.7535 - acc: 0.020 - ETA: 1:10 - loss: 4.7524 - acc: 0.020 - ETA: 1:09 - loss: 4.7528 - acc: 0.020 - ETA: 1:09 - loss: 4.7531 - acc: 0.020 - ETA: 1:09 - loss: 4.7519 - acc: 0.020 - ETA: 1:08 - loss: 4.7518 - acc: 0.020 - ETA: 1:08 - loss: 4.7520 - acc: 0.020 - ETA: 1:08 - loss: 4.7510 - acc: 0.021 - ETA: 1:07 - loss: 4.7509 - acc: 0.021 - ETA: 1:07 - loss: 4.7507 - acc: 0.021 - ETA: 1:06 - loss: 4.7510 - acc: 0.020 - ETA: 1:06 - loss: 4.7506 - acc: 0.021 - ETA: 1:06 - loss: 4.7516 - acc: 0.021 - ETA: 1:05 - loss: 4.7524 - acc: 0.021 - ETA: 1:05 - loss: 4.7535 - acc: 0.021 - ETA: 1:04 - loss: 4.7533 - acc: 0.020 - ETA: 1:04 - loss: 4.7536 - acc: 0.020 - ETA: 1:04 - loss: 4.7549 - 
acc: 0.020 - ETA: 1:03 - loss: 4.7542 - acc: 0.020 - ETA: 1:03 - loss: 4.7536 - acc: 0.020 - ETA: 1:02 - loss: 4.7542 - acc: 0.020 - ETA: 1:02 - loss: 4.7544 - acc: 0.020 - ETA: 1:02 - loss: 4.7537 - acc: 0.021 - ETA: 1:01 - loss: 4.7538 - acc: 0.021 - ETA: 1:01 - loss: 4.7543 - acc: 0.021 - ETA: 1:00 - loss: 4.7543 - acc: 0.021 - ETA: 1:00 - loss: 4.7542 - acc: 0.021 - ETA: 59s - loss: 4.7539 - acc: 0.020 - ETA: 59s - loss: 4.7533 - acc: 0.02 - ETA: 59s - loss: 4.7523 - acc: 0.02 - ETA: 58s - loss: 4.7514 - acc: 0.02 - ETA: 58s - loss: 4.7514 - acc: 0.02 - ETA: 57s - loss: 4.7511 - acc: 0.02 - ETA: 57s - loss: 4.7516 - acc: 0.02 - ETA: 57s - loss: 4.7520 - acc: 0.02 - ETA: 56s - loss: 4.7526 - acc: 0.02 - ETA: 56s - loss: 4.7525 - acc: 0.02 - ETA: 55s - loss: 4.7521 - acc: 0.02 - ETA: 55s - loss: 4.7523 - acc: 0.02 - ETA: 55s - loss: 4.7516 - acc: 0.02 - ETA: 54s - loss: 4.7511 - acc: 0.02 - ETA: 54s - loss: 4.7515 - acc: 0.02 - ETA: 53s - loss: 4.7515 - acc: 0.02 - ETA: 53s - loss: 4.7507 - acc: 0.02 - ETA: 53s - loss: 4.7504 - acc: 0.02016680/6680 [==============================] - ETA: 52s - loss: 4.7507 - acc: 0.02 - ETA: 52s - loss: 4.7509 - acc: 0.01 - ETA: 51s - loss: 4.7513 - acc: 0.01 - ETA: 51s - loss: 4.7517 - acc: 0.01 - ETA: 51s - loss: 4.7514 - acc: 0.01 - ETA: 50s - loss: 4.7510 - acc: 0.01 - ETA: 50s - loss: 4.7508 - acc: 0.01 - ETA: 49s - loss: 4.7512 - acc: 0.01 - ETA: 49s - loss: 4.7510 - acc: 0.01 - ETA: 49s - loss: 4.7512 - acc: 0.02 - ETA: 48s - loss: 4.7507 - acc: 0.02 - ETA: 48s - loss: 4.7516 - acc: 0.02 - ETA: 47s - loss: 4.7518 - acc: 0.02 - ETA: 47s - loss: 4.7519 - acc: 0.02 - ETA: 47s - loss: 4.7521 - acc: 0.02 - ETA: 46s - loss: 4.7520 - acc: 0.02 - ETA: 46s - loss: 4.7521 - acc: 0.02 - ETA: 45s - loss: 4.7524 - acc: 0.02 - ETA: 45s - loss: 4.7524 - acc: 0.02 - ETA: 45s - loss: 4.7522 - acc: 0.01 - ETA: 44s - loss: 4.7523 - acc: 0.01 - ETA: 44s - loss: 4.7524 - acc: 0.01 - ETA: 43s - loss: 4.7530 - acc: 0.01 - ETA: 43s - loss: 4.7529 - acc: 0.02 - ETA: 42s - loss: 4.7524 - acc: 0.02 - ETA: 42s - loss: 4.7515 - acc: 0.02 - ETA: 42s - loss: 4.7513 - acc: 0.02 - ETA: 41s - loss: 4.7514 - acc: 0.02 - ETA: 41s - loss: 4.7511 - acc: 0.02 - ETA: 40s - loss: 4.7520 - acc: 0.02 - ETA: 40s - loss: 4.7532 - acc: 0.02 - ETA: 40s - loss: 4.7529 - acc: 0.01 - ETA: 39s - loss: 4.7534 - acc: 0.02 - ETA: 39s - loss: 4.7531 - acc: 0.02 - ETA: 38s - loss: 4.7526 - acc: 0.02 - ETA: 38s - loss: 4.7526 - acc: 0.02 - ETA: 38s - loss: 4.7533 - acc: 0.02 - ETA: 37s - loss: 4.7533 - acc: 0.02 - ETA: 37s - loss: 4.7535 - acc: 0.02 - ETA: 36s - loss: 4.7533 - acc: 0.02 - ETA: 36s - loss: 4.7530 - acc: 0.02 - ETA: 35s - loss: 4.7525 - acc: 0.02 - ETA: 35s - loss: 4.7519 - acc: 0.02 - ETA: 35s - loss: 4.7513 - acc: 0.02 - ETA: 34s - loss: 4.7514 - acc: 0.02 - ETA: 34s - loss: 4.7515 - acc: 0.02 - ETA: 33s - loss: 4.7512 - acc: 0.02 - ETA: 33s - loss: 4.7505 - acc: 0.02 - ETA: 33s - loss: 4.7510 - acc: 0.02 - ETA: 32s - loss: 4.7506 - acc: 0.02 - ETA: 32s - loss: 4.7510 - acc: 0.02 - ETA: 31s - loss: 4.7511 - acc: 0.02 - ETA: 31s - loss: 4.7520 - acc: 0.02 - ETA: 31s - loss: 4.7524 - acc: 0.02 - ETA: 30s - loss: 4.7524 - acc: 0.02 - ETA: 30s - loss: 4.7522 - acc: 0.02 - ETA: 29s - loss: 4.7527 - acc: 0.02 - ETA: 29s - loss: 4.7525 - acc: 0.02 - ETA: 29s - loss: 4.7532 - acc: 0.02 - ETA: 28s - loss: 4.7530 - acc: 0.02 - ETA: 28s - loss: 4.7526 - acc: 0.02 - ETA: 27s - loss: 4.7522 - acc: 0.02 - ETA: 27s - loss: 4.7518 - acc: 0.02 - ETA: 26s - loss: 4.7517 - acc: 0.02 - ETA: 26s - loss: 
4.7516 - acc: 0.02 - ETA: 26s - loss: 4.7520 - acc: 0.02 - ETA: 25s - loss: 4.7515 - acc: 0.02 - ETA: 25s - loss: 4.7512 - acc: 0.02 - ETA: 24s - loss: 4.7514 - acc: 0.02 - ETA: 24s - loss: 4.7514 - acc: 0.02 - ETA: 24s - loss: 4.7511 - acc: 0.02 - ETA: 23s - loss: 4.7514 - acc: 0.02 - ETA: 23s - loss: 4.7511 - acc: 0.02 - ETA: 22s - loss: 4.7513 - acc: 0.02 - ETA: 22s - loss: 4.7521 - acc: 0.02 - ETA: 22s - loss: 4.7524 - acc: 0.02 - ETA: 21s - loss: 4.7527 - acc: 0.02 - ETA: 21s - loss: 4.7526 - acc: 0.02 - ETA: 20s - loss: 4.7521 - acc: 0.02 - ETA: 20s - loss: 4.7525 - acc: 0.02 - ETA: 20s - loss: 4.7519 - acc: 0.02 - ETA: 19s - loss: 4.7514 - acc: 0.02 - ETA: 19s - loss: 4.7521 - acc: 0.02 - ETA: 18s - loss: 4.7518 - acc: 0.02 - ETA: 18s - loss: 4.7517 - acc: 0.02 - ETA: 17s - loss: 4.7518 - acc: 0.02 - ETA: 17s - loss: 4.7514 - acc: 0.02 - ETA: 17s - loss: 4.7523 - acc: 0.02 - ETA: 16s - loss: 4.7516 - acc: 0.02 - ETA: 16s - loss: 4.7523 - acc: 0.02 - ETA: 15s - loss: 4.7519 - acc: 0.02 - ETA: 15s - loss: 4.7521 - acc: 0.02 - ETA: 15s - loss: 4.7519 - acc: 0.02 - ETA: 14s - loss: 4.7517 - acc: 0.02 - ETA: 14s - loss: 4.7521 - acc: 0.02 - ETA: 13s - loss: 4.7522 - acc: 0.02 - ETA: 13s - loss: 4.7518 - acc: 0.02 - ETA: 13s - loss: 4.7524 - acc: 0.02 - ETA: 12s - loss: 4.7523 - acc: 0.02 - ETA: 12s - loss: 4.7521 - acc: 0.02 - ETA: 11s - loss: 4.7517 - acc: 0.02 - ETA: 11s - loss: 4.7521 - acc: 0.02 - ETA: 11s - loss: 4.7520 - acc: 0.02 - ETA: 10s - loss: 4.7529 - acc: 0.02 - ETA: 10s - loss: 4.7531 - acc: 0.02 - ETA: 9s - loss: 4.7533 - acc: 0.0219 - ETA: 9s - loss: 4.7526 - acc: 0.022 - ETA: 8s - loss: 4.7528 - acc: 0.022 - ETA: 8s - loss: 4.7527 - acc: 0.022 - ETA: 8s - loss: 4.7529 - acc: 0.022 - ETA: 7s - loss: 4.7527 - acc: 0.021 - ETA: 7s - loss: 4.7529 - acc: 0.022 - ETA: 6s - loss: 4.7524 - acc: 0.021 - ETA: 6s - loss: 4.7523 - acc: 0.022 - ETA: 6s - loss: 4.7525 - acc: 0.022 - ETA: 5s - loss: 4.7524 - acc: 0.022 - ETA: 5s - loss: 4.7523 - acc: 0.022 - ETA: 4s - loss: 4.7528 - acc: 0.022 - ETA: 4s - loss: 4.7526 - acc: 0.022 - ETA: 4s - loss: 4.7527 - acc: 0.021 - ETA: 3s - loss: 4.7523 - acc: 0.022 - ETA: 3s - loss: 4.7525 - acc: 0.022 - ETA: 2s - loss: 4.7531 - acc: 0.022 - ETA: 2s - loss: 4.7529 - acc: 0.022 - ETA: 2s - loss: 4.7533 - acc: 0.022 - ETA: 1s - loss: 4.7539 - acc: 0.022 - ETA: 1s - loss: 4.7537 - acc: 0.022 - ETA: 0s - loss: 4.7529 - acc: 0.022 - ETA: 0s - loss: 4.7524 - acc: 0.023 - 143s 21ms/step - loss: 4.7516 - acc: 0.0234 - val_loss: 4.8298 - val_acc: 0.0311 Epoch 00005: val_loss did not improve
<keras.callbacks.History at 0x1c915bf1f98>
model.load_weights('saved_models/weights.best.from_scratch1.hdf5')
Try out your model on the test dataset of dog images. Ensure that your test accuracy is greater than 1%.
# get index of predicted dog breed for each image in test set
dog_breed_predictions = [np.argmax(model.predict(np.expand_dims(tensor, axis=0))) for tensor in test_tensors]
# report test accuracy
test_accuracy = 100*np.sum(np.array(dog_breed_predictions)==np.argmax(test_targets, axis=1))/len(dog_breed_predictions)
print('Test accuracy: %.4f%%' % test_accuracy)
Test accuracy: 1.7943%
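For context, random guessing across the 133 breeds would land near 1/133 ≈ 0.75%, so even this modest from-scratch network beats chance. As a sanity check (a sketch, not part of the original run), Keras can compute the same metric directly with evaluate(), since the model was compiled with metrics=['accuracy']:
# sketch: equivalent accuracy check using Keras' built-in evaluate();
# returns [loss, accuracy] because the model was compiled with metrics=['accuracy']
loss, acc = model.evaluate(test_tensors, test_targets, verbose=0)
print('Test accuracy: %.4f%%' % (100 * acc))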
bottleneck_features = np.load('bottleneck_features/DogVGG16Data.npz')
train_VGG16 = bottleneck_features['train']
valid_VGG16 = bottleneck_features['valid']
test_VGG16 = bottleneck_features['test']
The model uses the pre-trained VGG-16 network as a fixed feature extractor: the last convolutional output of VGG-16 is fed as input to our model. We only add a global average pooling layer and a fully connected layer, where the latter contains one node for each dog category and is equipped with a softmax activation.
VGG16_model = Sequential()
VGG16_model.add(GlobalAveragePooling2D(input_shape=train_VGG16.shape[1:]))
VGG16_model.add(Dense(133, activation='softmax'))
VGG16_model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
global_average_pooling2d_2 ( (None, 512)               0
_________________________________________________________________
dense_2 (Dense)              (None, 133)               68229
=================================================================
Total params: 68,229
Trainable params: 68,229
Non-trainable params: 0
_________________________________________________________________
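The parameter count is easy to verify by hand: the dense layer maps 512 pooled features to 133 breed nodes, so it holds 512 × 133 = 68,096 weights plus 133 biases, for 68,229 trainable parameters in total.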
VGG16_model.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
checkpointer = ModelCheckpoint(filepath='saved_models/weights.best.VGG16.hdf5',
verbose=1, save_best_only=True)
VGG16_model.fit(train_VGG16, train_targets,
validation_data=(valid_VGG16, valid_targets),
epochs=20, batch_size=20, callbacks=[checkpointer], verbose=1)
Train on 6680 samples, validate on 835 samples
Epoch 1/20
6680/6680 [==============================] - 2s 268us/step - loss: 12.3137 - acc: 0.1254 - val_loss: 10.7338 - val_acc: 0.1880
Epoch 00001: val_loss improved from inf to 10.73376, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 2/20
6680/6680 [==============================] - 1s 197us/step - loss: 9.7780 - acc: 0.2874 - val_loss: 9.6029 - val_acc: 0.2754
Epoch 00002: val_loss improved from 10.73376 to 9.60287, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 3/20
6680/6680 [==============================] - 1s 194us/step - loss: 8.9349 - acc: 0.3690 - val_loss: 9.0818 - val_acc: 0.3281
Epoch 00003: val_loss improved from 9.60287 to 9.08182, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 4/20
6680/6680 [==============================] - 1s 195us/step - loss: 8.3932 - acc: 0.4127 - val_loss: 8.7837 - val_acc: 0.3701
Epoch 00004: val_loss improved from 9.08182 to 8.78369, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 5/20
6680/6680 [==============================] - 1s 192us/step - loss: 8.1290 - acc: 0.4482 - val_loss: 8.5747 - val_acc: 0.3964
Epoch 00005: val_loss improved from 8.78369 to 8.57473, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 6/20
6680/6680 [==============================] - 1s 191us/step - loss: 7.8495 - acc: 0.4678 - val_loss: 8.1377 - val_acc: 0.4000
Epoch 00006: val_loss improved from 8.57473 to 8.13774, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 7/20
6680/6680 [==============================] - 1s 192us/step - loss: 7.4365 - acc: 0.4972 - val_loss: 7.8811 - val_acc: 0.4287
Epoch 00007: val_loss improved from 8.13774 to 7.88112, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 8/20
6680/6680 [==============================] - 1s 194us/step - loss: 7.1979 - acc: 0.5183 - val_loss: 7.8410 - val_acc: 0.4287
Epoch 00008: val_loss improved from 7.88112 to 7.84102, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 9/20
6680/6680 [==============================] - 1s 197us/step - loss: 7.0358 - acc: 0.5341 - val_loss: 7.5913 - val_acc: 0.4503
Epoch 00009: val_loss improved from 7.84102 to 7.59127, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 10/20
6680/6680 [==============================] - 1s 191us/step - loss: 6.7911 - acc: 0.5500 - val_loss: 7.4312 - val_acc: 0.4623
Epoch 00010: val_loss improved from 7.59127 to 7.43118, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 11/20
6680/6680 [==============================] - 1s 192us/step - loss: 6.5902 - acc: 0.5701 - val_loss: 7.3676 - val_acc: 0.4635
Epoch 00011: val_loss improved from 7.43118 to 7.36763, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 12/20
6680/6680 [==============================] - 1s 191us/step - loss: 6.5097 - acc: 0.5796 - val_loss: 7.1628 - val_acc: 0.4994
Epoch 00012: val_loss improved from 7.36763 to 7.16275, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 13/20
6680/6680 [==============================] - 1s 190us/step - loss: 6.4760 - acc: 0.5858 - val_loss: 7.2293 - val_acc: 0.4838
Epoch 00013: val_loss did not improve
Epoch 14/20
6680/6680 [==============================] - 1s 193us/step - loss: 6.4445 - acc: 0.5876 - val_loss: 7.1388 - val_acc: 0.5018
Epoch 00014: val_loss improved from 7.16275 to 7.13883, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 15/20
6680/6680 [==============================] - 1s 191us/step - loss: 6.4085 - acc: 0.5927 - val_loss: 7.1229 - val_acc: 0.5006
Epoch 00015: val_loss improved from 7.13883 to 7.12291, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 16/20
6680/6680 [==============================] - 1s 190us/step - loss: 6.2865 - acc: 0.5940 - val_loss: 7.0024 - val_acc: 0.4922
Epoch 00016: val_loss improved from 7.12291 to 7.00242, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 17/20
6680/6680 [==============================] - 1s 193us/step - loss: 6.0604 - acc: 0.6048 - val_loss: 6.8912 - val_acc: 0.4946
Epoch 00017: val_loss improved from 7.00242 to 6.89121, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 18/20
6680/6680 [==============================] - 1s 191us/step - loss: 5.9281 - acc: 0.6187 - val_loss: 6.8067 - val_acc: 0.5138
Epoch 00018: val_loss improved from 6.89121 to 6.80675, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 19/20
6680/6680 [==============================] - 1s 192us/step - loss: 5.8843 - acc: 0.6263 - val_loss: 6.7658 - val_acc: 0.5186
Epoch 00019: val_loss improved from 6.80675 to 6.76579, saving model to saved_models/weights.best.VGG16.hdf5
Epoch 20/20
6680/6680 [==============================] - 1s 189us/step - loss: 5.8700 - acc: 0.6301 - val_loss: 6.7570 - val_acc: 0.5066
Epoch 00020: val_loss improved from 6.76579 to 6.75696, saving model to saved_models/weights.best.VGG16.hdf5
<keras.callbacks.History at 0x1c9359efba8>
VGG16_model.load_weights('saved_models/weights.best.VGG16.hdf5')
Now, we can use the CNN to test how well it identifies breed within our test dataset of dog images. We print the test accuracy below.
# get index of predicted dog breed for each image in test set
VGG16_predictions = [np.argmax(VGG16_model.predict(np.expand_dims(feature, axis=0))) for feature in test_VGG16]
# report test accuracy
test_accuracy = 100*np.sum(np.array(VGG16_predictions)==np.argmax(test_targets, axis=1))/len(VGG16_predictions)
print('Test accuracy: %.4f%%' % test_accuracy)
Test accuracy: 49.8804%
from extract_bottleneck_features import *
def VGG16_predict_breed(img_path):
    # extract bottleneck features
    bottleneck_feature = extract_VGG16(path_to_tensor(img_path))
    # obtain predicted vector
    predicted_vector = VGG16_model.predict(bottleneck_feature)
    # return dog breed that is predicted by the model
    return dog_names[np.argmax(predicted_vector)]
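As a quick usage check (a sketch; any image path on disk would do), the helper can be pointed at one of the training files loaded at the top of the notebook:
# hypothetical spot-check of the breed predictor on one training image;
# train_files comes from the dataset-loading cell earlier in the notebook
sample_image = train_files[0]
print(VGG16_predict_breed(sample_image))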
You will now use transfer learning to create a CNN that can identify dog breed from images. Your CNN must attain at least 60% accuracy on the test set.
In Step 4, we used transfer learning to create a CNN using VGG-16 bottleneck features. In this section, you must use the bottleneck features from a different pre-trained model. To make things easier for you, we have pre-computed the features for all of the networks that are currently available in Keras.
The files are encoded as such:
Dog{network}Data.npz
where {network}, in the above filename, can be one of VGG19, Resnet50, InceptionV3, or Xception. Pick one of the above architectures, download the corresponding bottleneck features, and store the downloaded file in the bottleneck_features/ folder in the repository.
In the code block below, extract the bottleneck features corresponding to the train, test, and validation sets by running the following:
bottleneck_features = np.load('bottleneck_features/Dog{network}Data.npz')
train_{network} = bottleneck_features['train']
valid_{network} = bottleneck_features['valid']
test_{network} = bottleneck_features['test']
### TODO: Obtain bottleneck features from another pre-trained CNN.
# VGG19 bottleneck
bottleneck_features = np.load('bottleneck_features/DogVGG19Data.npz')
train_VGG19 = bottleneck_features['train']
valid_VGG19 = bottleneck_features['valid']
test_VGG19 = bottleneck_features['test']
# Resnet50 bottleneck
bottleneck_features = np.load('bottleneck_features/DogResnet50Data.npz')
train_Resnet50 = bottleneck_features['train']
valid_Resnet50 = bottleneck_features['valid']
test_Resnet50 = bottleneck_features['test']
# InceptionV3 bottleneck
bottleneck_features = np.load('bottleneck_features/DogInceptionV3Data.npz')
train_InceptionV3 = bottleneck_features['train']
valid_InceptionV3 = bottleneck_features['valid']
test_InceptionV3 = bottleneck_features['test']
# Xception bottleneck
bottleneck_features = np.load('bottleneck_features/DogXceptionData.npz')
train_Xception = bottleneck_features['train']
valid_Xception = bottleneck_features['valid']
test_Xception = bottleneck_features['test']
Create a CNN to classify dog breed. At the end of your code cell block, summarize the layers of your model by executing the line:
<your model's name>.summary()
Question 5: Outline the steps you took to get to your final CNN architecture and your reasoning at each step. Describe why you think the architecture is suitable for the current problem.
Answer: We have a small dataset and a classification task similar to ImageNet's, so, as covered in the course, the natural approach is to keep a pre-trained network's convolutional features, remove its classification top, and add our own classification layer. I did this for all four sets of bottleneck features, trying several pooling/flattening heads on each, and tested every variant to see which performs best. I picked Xception_model2, since it achieved the highest test accuracy (85%) across all the tests, while the rest scored lower; I left all the other models in place to show the score each of them obtained.
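For reference, the comparison described above uses the same test-accuracy calculation applied to the VGG16 model earlier. The helper below is an illustrative sketch (the name test_accuracy_of is mine, not part of the project template), applied to each head after the training cells that follow:
# sketch of the per-model comparison (helper name is illustrative)
def test_accuracy_of(model, test_features):
    # predicted breed index for each test image
    predictions = [np.argmax(model.predict(np.expand_dims(f, axis=0))) for f in test_features]
    # percentage agreement with the one-hot test labels
    return 100 * np.sum(np.array(predictions) == np.argmax(test_targets, axis=1)) / len(predictions)
# e.g., after training: test_accuracy_of(Xception_model2, test_Xception)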
# number of dog breed categories in the dataset
nm_classes = 133
### Define VGG19 architecture.
VGG19_model = Sequential()
VGG19_model.add(Flatten(input_shape=train_VGG19.shape[1:]))
VGG19_model.add(Dense(nm_classes, activation='softmax'))
VGG19_model.summary()
VGG19_model1 = Sequential()
VGG19_model1.add(AveragePooling2D(input_shape=train_VGG19.shape[1:]))
VGG19_model1.add(Flatten())
VGG19_model1.add(Dense(nm_classes, activation='softmax'))
VGG19_model1.summary()
VGG19_model2 = Sequential()
VGG19_model2.add(GlobalAveragePooling2D(input_shape=train_VGG19.shape[1:]))
VGG19_model2.add(Dense(nm_classes, activation='softmax'))
VGG19_model2.summary()
VGG19_model3 = Sequential()
VGG19_model3.add(GlobalMaxPooling2D(input_shape=train_VGG19.shape[1:]))
VGG19_model3.add(Dense(nm_classes, activation='softmax'))
VGG19_model3.summary()
VGG19_model4 = Sequential()
VGG19_model4.add(MaxPooling2D(input_shape=train_VGG19.shape[1:]))
VGG19_model4.add(Flatten())
VGG19_model4.add(Dense(nm_classes, activation='softmax'))
VGG19_model4.summary()
### Define InceptionV3 architecture.
InceptionV3_model = Sequential()
InceptionV3_model.add(Flatten(input_shape=train_InceptionV3.shape[1:]))
InceptionV3_model.add(Dense(nm_classes, activation='softmax'))
InceptionV3_model.summary()
InceptionV3_model1 = Sequential()
InceptionV3_model1.add(AveragePooling2D(input_shape=train_InceptionV3.shape[1:]))
InceptionV3_model1.add(Flatten())
InceptionV3_model1.add(Dense(nm_classes, activation='softmax'))
InceptionV3_model1.summary()
InceptionV3_model2 = Sequential()
InceptionV3_model2.add(GlobalAveragePooling2D(input_shape=train_InceptionV3.shape[1:]))
InceptionV3_model2.add(Dense(nm_classes, activation='softmax'))
InceptionV3_model2.summary()
InceptionV3_model3 = Sequential()
InceptionV3_model3.add(GlobalMaxPooling2D(input_shape=train_InceptionV3.shape[1:]))
InceptionV3_model3.add(Dense(nm_classes, activation='softmax'))
InceptionV3_model3.summary()
InceptionV3_model4 = Sequential()
InceptionV3_model4.add(MaxPooling2D(input_shape=train_InceptionV3.shape[1:]))
InceptionV3_model4.add(Flatten())
InceptionV3_model4.add(Dense(nm_classes, activation='softmax'))
InceptionV3_model4.summary()
### Define Resnet50 architecture.
Resnet50_model = Sequential()
Resnet50_model.add(Flatten(input_shape=train_Resnet50.shape[1:]))
Resnet50_model.add(Dense(nm_classes, activation='softmax'))
Resnet50_model.summary()
# The AveragePooling2D/MaxPooling2D heads are skipped for Resnet50: its
# bottleneck features are 1x1x2048, so there is no spatial extent left to pool.
#Resnet50_model1 = Sequential()
#Resnet50_model1.add(AveragePooling2D(input_shape=train_Resnet50.shape[1:]))
#Resnet50_model1.add(Flatten())
#Resnet50_model1.add(Dense(nm_classes, activation='softmax'))
#Resnet50_model1.summary()
Resnet50_model2 = Sequential()
Resnet50_model2.add(GlobalAveragePooling2D(input_shape=train_Resnet50.shape[1:]))
Resnet50_model2.add(Dense(nm_classes, activation='softmax'))
Resnet50_model2.summary()
Resnet50_model3 = Sequential()
Resnet50_model3.add(GlobalMaxPooling2D(input_shape=train_Resnet50.shape[1:]))
Resnet50_model3.add(Dense(nm_classes, activation='softmax'))
Resnet50_model3.summary()
#Resnet50_model4 = Sequential()
#Resnet50_model4.add(MaxPooling2D(input_shape=train_Resnet50.shape[1:]))
#Resnet50_model4.add(Flatten())
#Resnet50_model4.add(Dense(nm_classes, activation='softmax'))
#Resnet50_model4.summary()
### Define Xception architecture.
Xception_model = Sequential()
Xception_model.add(Flatten(input_shape=train_Xception.shape[1:]))
Xception_model.add(Dense(nm_classes, activation='softmax'))
Xception_model.summary()
Xception_model1 = Sequential()
Xception_model1.add(AveragePooling2D(input_shape=train_Xception.shape[1:]))
Xception_model1.add(Flatten())
Xception_model1.add(Dense(nm_classes, activation='softmax'))
Xception_model1.summary()
Xception_model2 = Sequential()
Xception_model2.add(GlobalAveragePooling2D(input_shape=train_Xception.shape[1:]))
Xception_model2.add(Dense(nm_classes, activation='softmax'))
Xception_model2.summary()
Xception_model3 = Sequential()
Xception_model3.add(GlobalMaxPooling2D(input_shape=train_Xception.shape[1:]))
Xception_model3.add(Dense(nm_classes, activation='softmax'))
Xception_model3.summary()
Xception_model4 = Sequential()
Xception_model4.add(MaxPooling2D(input_shape=train_Xception.shape[1:]))
Xception_model4.add(Flatten())
Xception_model4.add(Dense(nm_classes, activation='softmax'))
Xception_model4.summary()
Condensed model summaries (from the .summary() output above; every head ends in a single Dense(133, softmax) layer, and all parameters are trainable):

Backbone      Head                          Dense input   Total params
VGG19         Flatten                       25088         3,336,837
VGG19         AveragePooling2D + Flatten    4608          612,997
VGG19         GlobalAveragePooling2D        512           68,229
VGG19         GlobalMaxPooling2D            512           68,229
VGG19         MaxPooling2D + Flatten        4608          612,997
InceptionV3   Flatten                       51200         6,809,733
InceptionV3   AveragePooling2D + Flatten    8192          1,089,669
InceptionV3   GlobalAveragePooling2D        2048          272,517
InceptionV3   GlobalMaxPooling2D            2048          272,517
InceptionV3   MaxPooling2D + Flatten        8192          1,089,669
Resnet50      Flatten                       2048          272,517
Resnet50      GlobalAveragePooling2D        2048          272,517
Resnet50      GlobalMaxPooling2D            2048          272,517
Xception      Flatten                       100352        13,346,949
Xception      AveragePooling2D + Flatten    18432         2,451,589
Xception      GlobalAveragePooling2D        2048          272,517
Xception      GlobalMaxPooling2D            2048          272,517
Xception      MaxPooling2D + Flatten        18432         2,451,589
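Each of these heads has only one trainable layer, so the totals follow directly from the Dense layer's size: a Dense(133) on n inputs has n * 133 weights plus 133 biases. A quick arithmetic check against the summaries:
# Dense(133) parameter count = inputs * units + units (biases)
print(2048 * 133 + 133)   # 272517  -> matches the GlobalAveragePooling2D heads
print(25088 * 133 + 133)  # 3336837 -> matches the VGG19 Flatten head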
### Compile the model.
VGG19_model.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
VGG19_model1.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
VGG19_model2.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
VGG19_model3.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
VGG19_model4.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
InceptionV3_model.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
InceptionV3_model1.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
InceptionV3_model2.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
InceptionV3_model3.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
InceptionV3_model4.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
Resnet50_model.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
#Resnet50_model1.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
Resnet50_model2.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
Resnet50_model3.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
#Resnet50_model4.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
Xception_model.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
Xception_model1.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
Xception_model2.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
Xception_model3.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
Xception_model4.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
Train your model in the code cell below. Use model checkpointing to save the model that attains the best validation loss.
You are welcome to augment the training data, but this is not a requirement.
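Note that augmentation would have to be applied to the raw images before feature extraction; the pre-computed bottleneck arrays loaded above cannot be augmented directly. As an illustrative sketch (not used in the training below), Keras image augmentation looks like this, where train_tensors stands for a 4-D array of training images prepared earlier in the notebook:
from keras.preprocessing.image import ImageDataGenerator

# randomly perturb training images on the fly
datagen = ImageDataGenerator(rotation_range=20,
                             width_shift_range=0.1,
                             height_shift_range=0.1,
                             horizontal_flip=True)
# datagen.flow(train_tensors, train_targets, batch_size=100) would then feed
# a model trained end-to-end on images instead of on bottleneck features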
### TODO: Train the model.
# VGG19 part
print("we are at VGG19_model")
VGG19_checkpointer = ModelCheckpoint(filepath='saved_models/weights.best.VGG19.hdf5',
verbose=1, save_best_only=True)
VGG19_model.fit(train_VGG19, train_targets,
validation_data=(valid_VGG19, valid_targets),
epochs=20, batch_size=100, callbacks=[VGG19_checkpointer], verbose=1)
print("we are at VGG19_model1")
VGG19_checkpointer1 = ModelCheckpoint(filepath='saved_models/weights.best.VGG191.hdf5',
verbose=1, save_best_only=True)
VGG19_model1.fit(train_VGG19, train_targets,
validation_data=(valid_VGG19, valid_targets),
epochs=20, batch_size=100, callbacks=[VGG19_checkpointer1], verbose=1)
print("we are at VGG19_model2")
VGG19_checkpointer2 = ModelCheckpoint(filepath='saved_models/weights.best.VGG192.hdf5',
verbose=1, save_best_only=True)
VGG19_model2.fit(train_VGG19, train_targets,
validation_data=(valid_VGG19, valid_targets),
epochs=20, batch_size=100, callbacks=[VGG19_checkpointer2], verbose=1)
print("we are at VGG19_model3")
VGG19_checkpointer3 = ModelCheckpoint(filepath='saved_models/weights.best.VGG193.hdf5',
verbose=1, save_best_only=True)
VGG19_model3.fit(train_VGG19, train_targets,
validation_data=(valid_VGG19, valid_targets),
epochs=20, batch_size=100, callbacks=[VGG19_checkpointer3], verbose=1)
print("we are at VGG19_model4")
VGG19_checkpointer4 = ModelCheckpoint(filepath='saved_models/weights.best.VGG194.hdf5',
verbose=1, save_best_only=True)
VGG19_model4.fit(train_VGG19, train_targets,
validation_data=(valid_VGG19, valid_targets),
epochs=20, batch_size=100, callbacks=[VGG19_checkpointer4], verbose=1)
# InceptionV3 part
print("we are at InceptionV3_model")
InceptionV3_checkpointer = ModelCheckpoint(filepath='saved_models/weights.best.InceptionV3.hdf5',
verbose=1, save_best_only=True)
InceptionV3_model.fit(train_InceptionV3, train_targets,
validation_data=(valid_InceptionV3, valid_targets),
epochs=20, batch_size=100, callbacks=[InceptionV3_checkpointer], verbose=1)
print("we are at InceptionV3_model1")
InceptionV3_checkpointer1 = ModelCheckpoint(filepath='saved_models/weights.best.InceptionV31.hdf5',
verbose=1, save_best_only=True)
InceptionV3_model1.fit(train_InceptionV3, train_targets,
validation_data=(valid_InceptionV3, valid_targets),
epochs=20, batch_size=100, callbacks=[InceptionV3_checkpointer1], verbose=1)
print("we are at InceptionV3_model2")
InceptionV3_checkpointer2 = ModelCheckpoint(filepath='saved_models/weights.best.InceptionV32.hdf5',
verbose=1, save_best_only=True)
InceptionV3_model2.fit(train_InceptionV3, train_targets,
validation_data=(valid_InceptionV3, valid_targets),
epochs=20, batch_size=100, callbacks=[InceptionV3_checkpointer2], verbose=1)
print("we are at InceptionV3_model3")
InceptionV3_checkpointer3 = ModelCheckpoint(filepath='saved_models/weights.best.InceptionV33.hdf5',
verbose=1, save_best_only=True)
InceptionV3_model3.fit(train_InceptionV3, train_targets,
validation_data=(valid_InceptionV3, valid_targets),
epochs=20, batch_size=100, callbacks=[InceptionV3_checkpointer3], verbose=1)
print("we are at InceptionV3_model4")
InceptionV3_checkpointer4 = ModelCheckpoint(filepath='saved_models/weights.best.InceptionV34.hdf5',
verbose=1, save_best_only=True)
InceptionV3_model4.fit(train_InceptionV3, train_targets,
validation_data=(valid_InceptionV3, valid_targets),
epochs=20, batch_size=100, callbacks=[InceptionV3_checkpointer4], verbose=1)
# Resnet50 part
print("we are at Resnet50_model")
Resnet50_checkpointer = ModelCheckpoint(filepath='saved_models/weights.best.Resnet50.hdf5',
verbose=1, save_best_only=True)
Resnet50_model.fit(train_Resnet50, train_targets,
validation_data=(valid_Resnet50, valid_targets),
epochs=20, batch_size=100, callbacks=[Resnet50_checkpointer], verbose=1)
#print("we are at Resnet50_model1")
#Resnet50_checkpointer1 = ModelCheckpoint(filepath='saved_models/weights.best.Resnet501.hdf5',
# verbose=1, save_best_only=True)
#Resnet50_model1.fit(train_Resnet50, train_targets,
# validation_data=(valid_Resnet50, valid_targets),
# epochs=20, batch_size=100, callbacks=[Resnet50_checkpointer1], verbose=1)
print("we are at Resnet50_model2")
Resnet50_checkpointer2 = ModelCheckpoint(filepath='saved_models/weights.best.Resnet502.hdf5',
verbose=1, save_best_only=True)
Resnet50_model2.fit(train_Resnet50, train_targets,
validation_data=(valid_Resnet50, valid_targets),
epochs=20, batch_size=100, callbacks=[Resnet50_checkpointer2], verbose=1)
print("we are at Resnet50_model3")
Resnet50_checkpointer3 = ModelCheckpoint(filepath='saved_models/weights.best.Resnet503.hdf5',
verbose=1, save_best_only=True)
Resnet50_model3.fit(train_Resnet50, train_targets,
validation_data=(valid_Resnet50, valid_targets),
epochs=20, batch_size=100, callbacks=[Resnet50_checkpointer3], verbose=1)
#print("we are at Resnet50_model4")
#Resnet50_checkpointer4 = ModelCheckpoint(filepath='saved_models/weights.best.Resnet504.hdf5',
# verbose=1, save_best_only=True)
#Resnet50_model4.fit(train_Resnet50, train_targets,
# validation_data=(valid_Resnet50, valid_targets),
# epochs=20, batch_size=100, callbacks=[Resnet50_checkpointer4], verbose=1)
# Xception part
print("we are at Xception_model")
Xception_checkpointer = ModelCheckpoint(filepath='saved_models/weights.best.Xception.hdf5',
verbose=1, save_best_only=True)
Xception_model.fit(train_Xception, train_targets,
validation_data=(valid_Xception, valid_targets),
epochs=20, batch_size=100, callbacks=[Xception_checkpointer], verbose=1)
print("we are at Xception_model1")
Xception_checkpointer1 = ModelCheckpoint(filepath='saved_models/weights.best.Xception1.hdf5',
verbose=1, save_best_only=True)
Xception_model1.fit(train_Xception, train_targets,
validation_data=(valid_Xception, valid_targets),
epochs=20, batch_size=100, callbacks=[Xception_checkpointer1], verbose=1)
print("we are at Xception_model2")
Xception_checkpointer2 = ModelCheckpoint(filepath='saved_models/weights.best.Xception2.hdf5',
verbose=1, save_best_only=True)
Xception_model2.fit(train_Xception, train_targets,
validation_data=(valid_Xception, valid_targets),
epochs=20, batch_size=100, callbacks=[Xception_checkpointer2], verbose=1)
print("we are at Xception_model3")
Xception_checkpointer3 = ModelCheckpoint(filepath='saved_models/weights.best.Xception3.hdf5',
verbose=1, save_best_only=True)
Xception_model3.fit(train_Xception, train_targets,
validation_data=(valid_Xception, valid_targets),
epochs=20, batch_size=100, callbacks=[Xception_checkpointer3], verbose=1)
print("we are at Xception_model4")
Xception_checkpointer4 = ModelCheckpoint(filepath='saved_models/weights.best.Xception4.hdf5',
verbose=1, save_best_only=True)
Xception_model4.fit(train_Xception, train_targets,
validation_data=(valid_Xception, valid_targets),
epochs=20, batch_size=100, callbacks=[Xception_checkpointer4], verbose=1)
we are at VGG19_model
Train on 6680 samples, validate on 835 samples
(per-batch ETA progress lines trimmed; epoch-end results and checkpoint messages kept)
Epoch 1/20 - 6s 875us/step - loss: 15.5401 - acc: 0.0334 - val_loss: 15.3878 - val_acc: 0.0443
Epoch 00001: val_loss improved from inf to 15.38783, saving model to saved_models/weights.best.VGG19.hdf5
Epoch 2/20 - 4s 591us/step - loss: 15.2594 - acc: 0.0516 - val_loss: 15.3366 - val_acc: 0.0455
Epoch 00002: val_loss improved from 15.38783 to 15.33664, saving model to saved_models/weights.best.VGG19.hdf5
Epoch 3/20 - 4s 578us/step - loss: 15.1452 - acc: 0.0594 - val_loss: 15.1258 - val_acc: 0.0611
Epoch 00003: val_loss improved from 15.33664 to 15.12578, saving model to saved_models/weights.best.VGG19.hdf5
Epoch 4/20 - 4s 576us/step - loss: 15.0711 - acc: 0.0644 - val_loss: 15.0876 - val_acc: 0.0635
Epoch 00004: val_loss improved from 15.12578 to 15.08760, saving model to saved_models/weights.best.VGG19.hdf5
Epoch 5/20 - 4s 594us/step - loss: 14.8847 - acc: 0.0757 - val_loss: 14.8441 - val_acc: 0.0790
Epoch 00005: val_loss improved from 15.08760 to 14.84410, saving model to saved_models/weights.best.VGG19.hdf5
Epoch 6/20 - 4s 608us/step - loss: 14.8335 - acc: 0.0792 - val_loss: 15.0719 - val_acc: 0.0623
Epoch 00006: val_loss did not improve
Epoch 7/20 - 4s 597us/step - loss: 14.8202 - acc: 0.0799 - val_loss: 14.8348 - val_acc: 0.0790
Epoch 00007: val_loss improved from 14.84410 to 14.83481, saving model to saved_models/weights.best.VGG19.hdf5
Epoch 8/20 - 4s 569us/step - loss: 14.8231 - acc: 0.0802 - val_loss: 14.8034 - val_acc: 0.0814
Epoch 00008: val_loss improved from 14.83481 to 14.80339, saving model to saved_models/weights.best.VGG19.hdf5
Epoch 9/20 - 4s 576us/step - loss: 14.7910 - acc: 0.0823 - val_loss: 14.8110 - val_acc: 0.0802
Epoch 00009: val_loss did not improve
Epoch 10/20 - 4s 573us/step - loss: 14.7915 - acc: 0.0822 - val_loss: 14.8055 - val_acc: 0.0814
Epoch 00010: val_loss did not improve
Epoch 11/20 - 4s 571us/step - loss: 14.7891 - acc: 0.0819 - val_loss: 14.7970 - val_acc: 0.0814
Epoch 00011: val_loss improved from 14.80339 to 14.79701, saving model to saved_models/weights.best.VGG19.hdf5
Epoch 12/20 - 4s 576us/step - loss: 14.7886 - acc: 0.0825 - val_loss: 14.8289 - val_acc: 0.0790
Epoch 00012: val_loss did not improve
Epoch 13/20 - 4s 574us/step - loss: 14.7886 - acc: 0.0825 - val_loss: 14.8289 - val_acc: 0.0790
Epoch 00013: val_loss did not improve
Epoch 14/20 - 4s 576us/step - loss: 14.7886 - acc: 0.0825 - val_loss: 14.8289 - val_acc: 0.0790
Epoch 00014: val_loss did not improve
Epoch 15/20 6680/6680 [==============================] - ETA: 2s
- loss: 14.8517 - acc: 0.07 - ETA: 2s - loss: 14.8726 - acc: 0.07 - ETA: 2s - loss: 14.8286 - acc: 0.08 - ETA: 2s - loss: 14.8421 - acc: 0.07 - ETA: 2s - loss: 14.8286 - acc: 0.08 - ETA: 2s - loss: 14.8534 - acc: 0.07 - ETA: 2s - loss: 14.8585 - acc: 0.07 - ETA: 2s - loss: 14.8574 - acc: 0.07 - ETA: 2s - loss: 14.8620 - acc: 0.07 - ETA: 2s - loss: 14.8716 - acc: 0.07 - ETA: 1s - loss: 14.8598 - acc: 0.07 - ETA: 1s - loss: 14.8740 - acc: 0.07 - ETA: 1s - loss: 14.8580 - acc: 0.07 - ETA: 1s - loss: 14.8381 - acc: 0.07 - ETA: 1s - loss: 14.8102 - acc: 0.08 - ETA: 1s - loss: 14.8242 - acc: 0.08 - ETA: 1s - loss: 14.8199 - acc: 0.08 - ETA: 1s - loss: 14.8202 - acc: 0.08 - ETA: 1s - loss: 14.8328 - acc: 0.07 - ETA: 1s - loss: 14.8327 - acc: 0.07 - ETA: 1s - loss: 14.8208 - acc: 0.08 - ETA: 1s - loss: 14.8286 - acc: 0.08 - ETA: 1s - loss: 14.8249 - acc: 0.08 - ETA: 1s - loss: 14.8140 - acc: 0.08 - ETA: 1s - loss: 14.8215 - acc: 0.08 - ETA: 1s - loss: 14.8041 - acc: 0.08 - ETA: 1s - loss: 14.8012 - acc: 0.08 - ETA: 1s - loss: 14.8186 - acc: 0.08 - ETA: 0s - loss: 14.8221 - acc: 0.08 - ETA: 0s - loss: 14.8415 - acc: 0.07 - ETA: 0s - loss: 14.8381 - acc: 0.07 - ETA: 0s - loss: 14.8132 - acc: 0.08 - ETA: 0s - loss: 14.8134 - acc: 0.08 - ETA: 0s - loss: 14.8107 - acc: 0.08 - ETA: 0s - loss: 14.8111 - acc: 0.08 - ETA: 0s - loss: 14.7970 - acc: 0.08 - ETA: 0s - loss: 14.7862 - acc: 0.08 - ETA: 0s - loss: 14.7870 - acc: 0.08 - ETA: 0s - loss: 14.7931 - acc: 0.08 - ETA: 0s - loss: 14.7991 - acc: 0.08 - ETA: 0s - loss: 14.7943 - acc: 0.08 - ETA: 0s - loss: 14.7949 - acc: 0.08 - ETA: 0s - loss: 14.7954 - acc: 0.08 - ETA: 0s - loss: 14.7884 - acc: 0.08 - ETA: 0s - loss: 14.7915 - acc: 0.08 - ETA: 0s - loss: 14.7945 - acc: 0.08 - 4s 574us/step - loss: 14.7886 - acc: 0.0825 - val_loss: 14.8289 - val_acc: 0.0790 Epoch 00015: val_loss did not improve Epoch 16/20 6680/6680 [==============================] - ETA: 3s - loss: 14.5063 - acc: 0.10 - ETA: 3s - loss: 14.8286 - acc: 0.08 - ETA: 3s - loss: 14.8286 - acc: 0.08 - ETA: 3s - loss: 14.7884 - acc: 0.08 - ETA: 3s - loss: 14.8286 - acc: 0.08 - ETA: 3s - loss: 14.8555 - acc: 0.07 - ETA: 3s - loss: 14.8517 - acc: 0.07 - ETA: 3s - loss: 14.8689 - acc: 0.07 - ETA: 3s - loss: 14.8107 - acc: 0.08 - ETA: 3s - loss: 14.8448 - acc: 0.07 - ETA: 3s - loss: 14.7847 - acc: 0.08 - ETA: 3s - loss: 14.8152 - acc: 0.08 - ETA: 2s - loss: 14.8039 - acc: 0.08 - ETA: 2s - loss: 14.7711 - acc: 0.08 - ETA: 2s - loss: 14.7212 - acc: 0.08 - ETA: 2s - loss: 14.6876 - acc: 0.08 - ETA: 2s - loss: 14.6390 - acc: 0.09 - ETA: 2s - loss: 14.6496 - acc: 0.09 - ETA: 2s - loss: 14.6760 - acc: 0.08 - ETA: 2s - loss: 14.6675 - acc: 0.09 - ETA: 2s - loss: 14.7058 - acc: 0.08 - ETA: 2s - loss: 14.7334 - acc: 0.08 - ETA: 2s - loss: 14.7375 - acc: 0.08 - ETA: 2s - loss: 14.7212 - acc: 0.08 - ETA: 2s - loss: 14.7255 - acc: 0.08 - ETA: 2s - loss: 14.7295 - acc: 0.08 - ETA: 2s - loss: 14.7331 - acc: 0.08 - ETA: 2s - loss: 14.7596 - acc: 0.08 - ETA: 2s - loss: 14.7508 - acc: 0.08 - ETA: 2s - loss: 14.7642 - acc: 0.08 - ETA: 2s - loss: 14.7767 - acc: 0.08 - ETA: 1s - loss: 14.7732 - acc: 0.08 - ETA: 1s - loss: 14.7847 - acc: 0.08 - ETA: 1s - loss: 14.8002 - acc: 0.08 - ETA: 1s - loss: 14.8056 - acc: 0.08 - ETA: 1s - loss: 14.8107 - acc: 0.08 - ETA: 1s - loss: 14.8199 - acc: 0.08 - ETA: 1s - loss: 14.8202 - acc: 0.08 - ETA: 1s - loss: 14.8163 - acc: 0.08 - ETA: 1s - loss: 14.8206 - acc: 0.08 - ETA: 1s - loss: 14.7933 - acc: 0.08 - ETA: 1s - loss: 14.8018 - acc: 0.08 - ETA: 1s - loss: 14.8099 - acc: 0.08 - 
ETA: 1s - loss: 14.8213 - acc: 0.08 - ETA: 1s - loss: 14.8322 - acc: 0.07 - ETA: 1s - loss: 14.8322 - acc: 0.07 - ETA: 1s - loss: 14.8355 - acc: 0.07 - ETA: 1s - loss: 14.8488 - acc: 0.07 - ETA: 0s - loss: 14.8418 - acc: 0.07 - ETA: 0s - loss: 14.8319 - acc: 0.07 - ETA: 0s - loss: 14.8192 - acc: 0.08 - ETA: 0s - loss: 14.8132 - acc: 0.08 - ETA: 0s - loss: 14.8134 - acc: 0.08 - ETA: 0s - loss: 14.8197 - acc: 0.08 - ETA: 0s - loss: 14.8228 - acc: 0.08 - ETA: 0s - loss: 14.8200 - acc: 0.08 - ETA: 0s - loss: 14.8060 - acc: 0.08 - ETA: 0s - loss: 14.8064 - acc: 0.08 - ETA: 0s - loss: 14.7822 - acc: 0.08 - ETA: 0s - loss: 14.7857 - acc: 0.08 - ETA: 0s - loss: 14.7890 - acc: 0.08 - ETA: 0s - loss: 14.7871 - acc: 0.08 - ETA: 0s - loss: 14.7877 - acc: 0.08 - ETA: 0s - loss: 14.7858 - acc: 0.08 - ETA: 0s - loss: 14.7915 - acc: 0.08 - ETA: 0s - loss: 14.7945 - acc: 0.08 - 4s 576us/step - loss: 14.7886 - acc: 0.0825 - val_loss: 14.8289 - val_acc: 0.0790 Epoch 00016: val_loss did not improve Epoch 17/20 6680/6680 [==============================] - ETA: 4s - loss: 15.3122 - acc: 0.05 - ETA: 4s - loss: 14.9092 - acc: 0.07 - ETA: 3s - loss: 14.7212 - acc: 0.08 - ETA: 3s - loss: 14.5869 - acc: 0.09 - ETA: 3s - loss: 14.6675 - acc: 0.09 - ETA: 3s - loss: 14.6675 - acc: 0.09 - ETA: 3s - loss: 14.6214 - acc: 0.09 - ETA: 3s - loss: 14.5869 - acc: 0.09 - ETA: 3s - loss: 14.6317 - acc: 0.09 - ETA: 3s - loss: 14.6191 - acc: 0.09 - ETA: 3s - loss: 14.6528 - acc: 0.09 - ETA: 3s - loss: 14.6540 - acc: 0.09 - ETA: 3s - loss: 14.7047 - acc: 0.08 - ETA: 3s - loss: 14.7135 - acc: 0.08 - ETA: 2s - loss: 14.6997 - acc: 0.08 - ETA: 2s - loss: 14.6775 - acc: 0.08 - ETA: 2s - loss: 14.6580 - acc: 0.09 - ETA: 2s - loss: 14.6675 - acc: 0.09 - ETA: 2s - loss: 14.6420 - acc: 0.09 - ETA: 2s - loss: 14.6352 - acc: 0.09 - ETA: 2s - loss: 14.6751 - acc: 0.08 - ETA: 2s - loss: 14.6675 - acc: 0.09 - ETA: 2s - loss: 14.6955 - acc: 0.08 - ETA: 2s - loss: 14.7212 - acc: 0.08 - ETA: 2s - loss: 14.7190 - acc: 0.08 - ETA: 2s - loss: 14.7109 - acc: 0.08 - ETA: 2s - loss: 14.7272 - acc: 0.08 - ETA: 2s - loss: 14.7365 - acc: 0.08 - ETA: 2s - loss: 14.7286 - acc: 0.08 - ETA: 2s - loss: 14.7427 - acc: 0.08 - ETA: 2s - loss: 14.7351 - acc: 0.08 - ETA: 1s - loss: 14.7481 - acc: 0.08 - ETA: 1s - loss: 14.7505 - acc: 0.08 - ETA: 1s - loss: 14.7575 - acc: 0.08 - ETA: 1s - loss: 14.7504 - acc: 0.08 - ETA: 1s - loss: 14.7391 - acc: 0.08 - ETA: 1s - loss: 14.7502 - acc: 0.08 - ETA: 1s - loss: 14.7481 - acc: 0.08 - ETA: 1s - loss: 14.7543 - acc: 0.08 - ETA: 1s - loss: 14.7521 - acc: 0.08 - ETA: 1s - loss: 14.7579 - acc: 0.08 - ETA: 1s - loss: 14.7634 - acc: 0.08 - ETA: 1s - loss: 14.7537 - acc: 0.08 - ETA: 1s - loss: 14.7590 - acc: 0.08 - ETA: 1s - loss: 14.7713 - acc: 0.08 - ETA: 1s - loss: 14.7586 - acc: 0.08 - ETA: 1s - loss: 14.7532 - acc: 0.08 - ETA: 1s - loss: 14.7682 - acc: 0.08 - ETA: 0s - loss: 14.7596 - acc: 0.08 - ETA: 0s - loss: 14.7803 - acc: 0.08 - ETA: 0s - loss: 14.7876 - acc: 0.08 - ETA: 0s - loss: 14.7884 - acc: 0.08 - ETA: 0s - loss: 14.7800 - acc: 0.08 - ETA: 0s - loss: 14.7809 - acc: 0.08 - ETA: 0s - loss: 14.7847 - acc: 0.08 - ETA: 0s - loss: 14.7884 - acc: 0.08 - ETA: 0s - loss: 14.7919 - acc: 0.08 - ETA: 0s - loss: 14.7953 - acc: 0.08 - ETA: 0s - loss: 14.7904 - acc: 0.08 - ETA: 0s - loss: 14.7830 - acc: 0.08 - ETA: 0s - loss: 14.7890 - acc: 0.08 - ETA: 0s - loss: 14.7897 - acc: 0.08 - ETA: 0s - loss: 14.7954 - acc: 0.08 - ETA: 0s - loss: 14.7934 - acc: 0.08 - ETA: 0s - loss: 14.7865 - acc: 0.08 - ETA: 0s - loss: 14.7945 - acc: 
0.08 - 4s 575us/step - loss: 14.7886 - acc: 0.0825 - val_loss: 14.8289 - val_acc: 0.0790 Epoch 00017: val_loss did not improve Epoch 18/20 6680/6680 [==============================] - ETA: 3s - loss: 15.1510 - acc: 0.06 - ETA: 3s - loss: 15.0704 - acc: 0.06 - ETA: 3s - loss: 15.0436 - acc: 0.06 - ETA: 3s - loss: 15.1510 - acc: 0.06 - ETA: 3s - loss: 14.8931 - acc: 0.07 - ETA: 3s - loss: 14.8286 - acc: 0.08 - ETA: 3s - loss: 14.7365 - acc: 0.08 - ETA: 3s - loss: 14.7682 - acc: 0.08 - ETA: 3s - loss: 14.6675 - acc: 0.09 - ETA: 3s - loss: 14.5869 - acc: 0.09 - ETA: 3s - loss: 14.6089 - acc: 0.09 - ETA: 3s - loss: 14.6003 - acc: 0.09 - ETA: 2s - loss: 14.5807 - acc: 0.09 - ETA: 2s - loss: 14.5639 - acc: 0.09 - ETA: 2s - loss: 14.5923 - acc: 0.09 - ETA: 2s - loss: 14.6574 - acc: 0.09 - ETA: 2s - loss: 14.6295 - acc: 0.09 - ETA: 2s - loss: 14.6406 - acc: 0.09 - ETA: 2s - loss: 14.7014 - acc: 0.08 - ETA: 2s - loss: 14.6997 - acc: 0.08 - ETA: 2s - loss: 14.7289 - acc: 0.08 - ETA: 2s - loss: 14.7407 - acc: 0.08 - ETA: 2s - loss: 14.7446 - acc: 0.08 - ETA: 2s - loss: 14.7145 - acc: 0.08 - ETA: 2s - loss: 14.7255 - acc: 0.08 - ETA: 2s - loss: 14.7605 - acc: 0.08 - ETA: 2s - loss: 14.7749 - acc: 0.08 - ETA: 2s - loss: 14.7538 - acc: 0.08 - ETA: 2s - loss: 14.7397 - acc: 0.08 - ETA: 2s - loss: 14.7481 - acc: 0.08 - ETA: 2s - loss: 14.7559 - acc: 0.08 - ETA: 1s - loss: 14.7581 - acc: 0.08 - ETA: 1s - loss: 14.7798 - acc: 0.08 - ETA: 1s - loss: 14.7907 - acc: 0.08 - ETA: 1s - loss: 14.7872 - acc: 0.08 - ETA: 1s - loss: 14.7749 - acc: 0.08 - ETA: 1s - loss: 14.7720 - acc: 0.08 - ETA: 1s - loss: 14.7778 - acc: 0.08 - ETA: 1s - loss: 14.7791 - acc: 0.08 - ETA: 1s - loss: 14.7843 - acc: 0.08 - ETA: 1s - loss: 14.7815 - acc: 0.08 - ETA: 1s - loss: 14.7864 - acc: 0.08 - ETA: 1s - loss: 14.7912 - acc: 0.08 - ETA: 1s - loss: 14.7810 - acc: 0.08 - ETA: 1s - loss: 14.7821 - acc: 0.08 - ETA: 1s - loss: 14.7866 - acc: 0.08 - ETA: 1s - loss: 14.7875 - acc: 0.08 - ETA: 1s - loss: 14.7682 - acc: 0.08 - ETA: 1s - loss: 14.7727 - acc: 0.08 - ETA: 0s - loss: 14.7706 - acc: 0.08 - ETA: 0s - loss: 14.7654 - acc: 0.08 - ETA: 0s - loss: 14.7605 - acc: 0.08 - ETA: 0s - loss: 14.7678 - acc: 0.08 - ETA: 0s - loss: 14.7630 - acc: 0.08 - ETA: 0s - loss: 14.7583 - acc: 0.08 - ETA: 0s - loss: 14.7625 - acc: 0.08 - ETA: 0s - loss: 14.7636 - acc: 0.08 - ETA: 0s - loss: 14.7675 - acc: 0.08 - ETA: 0s - loss: 14.7767 - acc: 0.08 - ETA: 0s - loss: 14.7776 - acc: 0.08 - ETA: 0s - loss: 14.7705 - acc: 0.08 - ETA: 0s - loss: 14.7741 - acc: 0.08 - ETA: 0s - loss: 14.7800 - acc: 0.08 - ETA: 0s - loss: 14.7833 - acc: 0.08 - ETA: 0s - loss: 14.7915 - acc: 0.08 - ETA: 0s - loss: 14.7871 - acc: 0.08 - 4s 582us/step - loss: 14.7886 - acc: 0.0825 - val_loss: 14.8289 - val_acc: 0.0790 Epoch 00018: val_loss did not improve Epoch 19/20 6680/6680 [==============================] - ETA: 3s - loss: 14.9898 - acc: 0.07 - ETA: 3s - loss: 14.9898 - acc: 0.07 - ETA: 3s - loss: 14.7749 - acc: 0.08 - ETA: 3s - loss: 14.8287 - acc: 0.08 - ETA: 3s - loss: 14.8287 - acc: 0.08 - ETA: 3s - loss: 14.8555 - acc: 0.07 - ETA: 3s - loss: 14.9438 - acc: 0.07 - ETA: 3s - loss: 14.9898 - acc: 0.07 - ETA: 3s - loss: 15.0794 - acc: 0.06 - ETA: 3s - loss: 15.0382 - acc: 0.06 - ETA: 3s - loss: 15.0631 - acc: 0.06 - ETA: 3s - loss: 15.0301 - acc: 0.06 - ETA: 2s - loss: 15.0394 - acc: 0.06 - ETA: 2s - loss: 15.0359 - acc: 0.06 - ETA: 2s - loss: 15.0328 - acc: 0.06 - ETA: 2s - loss: 14.9697 - acc: 0.07 - ETA: 2s - loss: 14.9519 - acc: 0.07 - ETA: 2s - loss: 14.9540 - acc: 0.07 - 
ETA: 2s - loss: 14.9474 - acc: 0.07 - ETA: 2s - loss: 14.9254 - acc: 0.07 - ETA: 2s - loss: 14.8977 - acc: 0.07 - ETA: 2s - loss: 14.9019 - acc: 0.07 - ETA: 2s - loss: 14.8917 - acc: 0.07 - ETA: 2s - loss: 14.9092 - acc: 0.07 - ETA: 2s - loss: 14.9189 - acc: 0.07 - ETA: 2s - loss: 14.9464 - acc: 0.07 - ETA: 2s - loss: 14.9361 - acc: 0.07 - ETA: 2s - loss: 14.9265 - acc: 0.07 - ETA: 2s - loss: 14.9287 - acc: 0.07 - ETA: 2s - loss: 14.9146 - acc: 0.07 - ETA: 1s - loss: 14.9222 - acc: 0.07 - ETA: 1s - loss: 14.9395 - acc: 0.07 - ETA: 1s - loss: 14.9508 - acc: 0.07 - ETA: 1s - loss: 14.9329 - acc: 0.07 - ETA: 1s - loss: 14.9392 - acc: 0.07 - ETA: 1s - loss: 14.9271 - acc: 0.07 - ETA: 1s - loss: 14.9288 - acc: 0.07 - ETA: 1s - loss: 14.8965 - acc: 0.07 - ETA: 1s - loss: 14.8906 - acc: 0.07 - ETA: 1s - loss: 14.8810 - acc: 0.07 - ETA: 1s - loss: 14.8719 - acc: 0.07 - ETA: 1s - loss: 14.8747 - acc: 0.07 - ETA: 1s - loss: 14.8811 - acc: 0.07 - ETA: 1s - loss: 14.8799 - acc: 0.07 - ETA: 1s - loss: 14.8645 - acc: 0.07 - ETA: 1s - loss: 14.8742 - acc: 0.07 - ETA: 1s - loss: 14.8561 - acc: 0.07 - ETA: 1s - loss: 14.8522 - acc: 0.07 - ETA: 0s - loss: 14.8385 - acc: 0.07 - ETA: 0s - loss: 14.8351 - acc: 0.07 - ETA: 0s - loss: 14.8350 - acc: 0.07 - ETA: 0s - loss: 14.8441 - acc: 0.07 - ETA: 0s - loss: 14.8378 - acc: 0.07 - ETA: 0s - loss: 14.8376 - acc: 0.07 - ETA: 0s - loss: 14.8228 - acc: 0.08 - ETA: 0s - loss: 14.8229 - acc: 0.08 - ETA: 0s - loss: 14.8315 - acc: 0.07 - ETA: 0s - loss: 14.8259 - acc: 0.08 - ETA: 0s - loss: 14.8177 - acc: 0.08 - ETA: 0s - loss: 14.8152 - acc: 0.08 - ETA: 0s - loss: 14.8128 - acc: 0.08 - ETA: 0s - loss: 14.8079 - acc: 0.08 - ETA: 0s - loss: 14.8005 - acc: 0.08 - ETA: 0s - loss: 14.8035 - acc: 0.08 - ETA: 0s - loss: 14.8088 - acc: 0.08 - ETA: 0s - loss: 14.7920 - acc: 0.08 - 4s 573us/step - loss: 14.7886 - acc: 0.0825 - val_loss: 14.8289 - val_acc: 0.0790 Epoch 00019: val_loss did not improve Epoch 20/20 6680/6680 [==============================] - ETA: 3s - loss: 14.8286 - acc: 0.08 - ETA: 3s - loss: 15.2316 - acc: 0.05 - ETA: 3s - loss: 15.1510 - acc: 0.06 - ETA: 3s - loss: 14.9898 - acc: 0.07 - ETA: 3s - loss: 15.0543 - acc: 0.06 - ETA: 3s - loss: 14.9898 - acc: 0.07 - ETA: 3s - loss: 14.9208 - acc: 0.07 - ETA: 3s - loss: 14.8689 - acc: 0.07 - ETA: 3s - loss: 14.7928 - acc: 0.08 - ETA: 3s - loss: 14.7642 - acc: 0.08 - ETA: 3s - loss: 14.8140 - acc: 0.08 - ETA: 3s - loss: 14.8689 - acc: 0.07 - ETA: 2s - loss: 14.8410 - acc: 0.07 - ETA: 2s - loss: 14.8171 - acc: 0.08 - ETA: 2s - loss: 14.8072 - acc: 0.08 - ETA: 2s - loss: 14.8085 - acc: 0.08 - ETA: 2s - loss: 14.8097 - acc: 0.08 - ETA: 2s - loss: 14.8376 - acc: 0.07 - ETA: 2s - loss: 14.7947 - acc: 0.08 - ETA: 2s - loss: 14.7964 - acc: 0.08 - ETA: 2s - loss: 14.8056 - acc: 0.08 - ETA: 2s - loss: 14.8140 - acc: 0.08 - ETA: 2s - loss: 14.7936 - acc: 0.08 - ETA: 2s - loss: 14.7615 - acc: 0.08 - ETA: 2s - loss: 14.7964 - acc: 0.08 - ETA: 2s - loss: 14.7915 - acc: 0.08 - ETA: 2s - loss: 14.7690 - acc: 0.08 - ETA: 2s - loss: 14.7711 - acc: 0.08 - ETA: 2s - loss: 14.7731 - acc: 0.08 - ETA: 2s - loss: 14.7857 - acc: 0.08 - ETA: 1s - loss: 14.7715 - acc: 0.08 - ETA: 1s - loss: 14.7682 - acc: 0.08 - ETA: 1s - loss: 14.7945 - acc: 0.08 - ETA: 1s - loss: 14.8049 - acc: 0.08 - ETA: 1s - loss: 14.7964 - acc: 0.08 - ETA: 1s - loss: 14.7973 - acc: 0.08 - ETA: 1s - loss: 14.8025 - acc: 0.08 - ETA: 1s - loss: 14.7862 - acc: 0.08 - ETA: 1s - loss: 14.7873 - acc: 0.08 - ETA: 1s - loss: 14.7884 - acc: 0.08 - ETA: 1s - loss: 14.7775 - acc: 
0.08 - ETA: 1s - loss: 14.7941 - acc: 0.08 - ETA: 1s - loss: 14.7987 - acc: 0.08 - ETA: 1s - loss: 14.7920 - acc: 0.08 - ETA: 1s - loss: 14.8036 - acc: 0.08 - ETA: 1s - loss: 14.8006 - acc: 0.08 - ETA: 1s - loss: 14.7944 - acc: 0.08 - ETA: 1s - loss: 14.8018 - acc: 0.08 - ETA: 0s - loss: 14.8056 - acc: 0.08 - ETA: 0s - loss: 14.8029 - acc: 0.08 - ETA: 0s - loss: 14.8034 - acc: 0.08 - ETA: 0s - loss: 14.8101 - acc: 0.08 - ETA: 0s - loss: 14.8043 - acc: 0.08 - ETA: 0s - loss: 14.8048 - acc: 0.08 - ETA: 0s - loss: 14.8023 - acc: 0.08 - ETA: 0s - loss: 14.8114 - acc: 0.08 - ETA: 0s - loss: 14.8145 - acc: 0.08 - ETA: 0s - loss: 14.8231 - acc: 0.08 - ETA: 0s - loss: 14.8232 - acc: 0.08 - ETA: 0s - loss: 14.8233 - acc: 0.08 - ETA: 0s - loss: 14.8234 - acc: 0.08 - ETA: 0s - loss: 14.8157 - acc: 0.08 - ETA: 0s - loss: 14.8107 - acc: 0.08 - ETA: 0s - loss: 14.7984 - acc: 0.08 - ETA: 0s - loss: 14.7915 - acc: 0.08 - ETA: 0s - loss: 14.7896 - acc: 0.08 - 4s 573us/step - loss: 14.7886 - acc: 0.0825 - val_loss: 14.8289 - val_acc: 0.0790 Epoch 00020: val_loss did not improve we are at VGG19_model1 Train on 6680 samples, validate on 835 samples Epoch 1/20 6680/6680 [==============================] - ETA: 35s - loss: 15.5872 - acc: 0.010 - ETA: 9s - loss: 15.8218 - acc: 0.007 - ETA: 5s - loss: 15.6720 - acc: 0.01 - ETA: 3s - loss: 15.5689 - acc: 0.02 - ETA: 3s - loss: 15.5962 - acc: 0.02 - ETA: 2s - loss: 15.5938 - acc: 0.02 - ETA: 2s - loss: 15.6054 - acc: 0.02 - ETA: 1s - loss: 15.5542 - acc: 0.02 - ETA: 1s - loss: 15.5636 - acc: 0.02 - ETA: 1s - loss: 15.5657 - acc: 0.02 - ETA: 1s - loss: 15.5471 - acc: 0.02 - ETA: 1s - loss: 15.5698 - acc: 0.02 - ETA: 0s - loss: 15.5796 - acc: 0.02 - ETA: 0s - loss: 15.5707 - acc: 0.02 - ETA: 0s - loss: 15.5553 - acc: 0.02 - ETA: 0s - loss: 15.5513 - acc: 0.02 - ETA: 0s - loss: 15.5243 - acc: 0.03 - ETA: 0s - loss: 15.5235 - acc: 0.03 - ETA: 0s - loss: 15.5003 - acc: 0.03 - ETA: 0s - loss: 15.4776 - acc: 0.03 - ETA: 0s - loss: 15.4631 - acc: 0.03 - ETA: 0s - loss: 15.4531 - acc: 0.03 - 2s 272us/step - loss: 15.4457 - acc: 0.0350 - val_loss: 15.0151 - val_acc: 0.0599 Epoch 00001: val_loss improved from inf to 15.01505, saving model to saved_models/weights.best.VGG191.hdf5 Epoch 2/20 6680/6680 [==============================] - ETA: 1s - loss: 15.4960 - acc: 0.03 - ETA: 1s - loss: 15.0343 - acc: 0.06 - ETA: 1s - loss: 14.9523 - acc: 0.06 - ETA: 1s - loss: 14.7847 - acc: 0.07 - ETA: 1s - loss: 14.8730 - acc: 0.07 - ETA: 0s - loss: 14.9302 - acc: 0.06 - ETA: 0s - loss: 14.9033 - acc: 0.06 - ETA: 0s - loss: 14.8655 - acc: 0.06 - ETA: 0s - loss: 14.8994 - acc: 0.06 - ETA: 0s - loss: 14.9167 - acc: 0.06 - ETA: 0s - loss: 14.8927 - acc: 0.06 - ETA: 0s - loss: 14.8916 - acc: 0.06 - ETA: 0s - loss: 14.9012 - acc: 0.06 - ETA: 0s - loss: 14.8875 - acc: 0.07 - ETA: 0s - loss: 14.8778 - acc: 0.07 - ETA: 0s - loss: 14.8724 - acc: 0.07 - ETA: 0s - loss: 14.8798 - acc: 0.07 - ETA: 0s - loss: 14.8831 - acc: 0.07 - ETA: 0s - loss: 14.8717 - acc: 0.07 - ETA: 0s - loss: 14.8899 - acc: 0.07 - ETA: 0s - loss: 14.8766 - acc: 0.07 - ETA: 0s - loss: 14.8679 - acc: 0.07 - 1s 191us/step - loss: 14.8483 - acc: 0.0728 - val_loss: 14.9823 - val_acc: 0.0647 Epoch 00002: val_loss improved from 15.01505 to 14.98229, saving model to saved_models/weights.best.VGG191.hdf5 Epoch 3/20 6680/6680 [==============================] - ETA: 1s - loss: 14.0227 - acc: 0.13 - ETA: 1s - loss: 14.7932 - acc: 0.07 - ETA: 1s - loss: 14.7925 - acc: 0.07 - ETA: 1s - loss: 14.7840 - acc: 0.07 - ETA: 0s - loss: 14.7810 - 
acc: 0.07 - ETA: 0s - loss: 14.6893 - acc: 0.08 - ETA: 0s - loss: 14.7097 - acc: 0.08 - ETA: 0s - loss: 14.7491 - acc: 0.08 - ETA: 0s - loss: 14.7006 - acc: 0.08 - ETA: 0s - loss: 14.6685 - acc: 0.08 - ETA: 0s - loss: 14.6861 - acc: 0.08 - ETA: 0s - loss: 14.6985 - acc: 0.08 - ETA: 0s - loss: 14.6808 - acc: 0.08 - ETA: 0s - loss: 14.6440 - acc: 0.08 - ETA: 0s - loss: 14.6021 - acc: 0.08 - ETA: 0s - loss: 14.5931 - acc: 0.08 - ETA: 0s - loss: 14.5782 - acc: 0.09 - ETA: 0s - loss: 14.5627 - acc: 0.09 - ETA: 0s - loss: 14.5687 - acc: 0.09 - ETA: 0s - loss: 14.5623 - acc: 0.09 - ETA: 0s - loss: 14.5800 - acc: 0.09 - ETA: 0s - loss: 14.6008 - acc: 0.08 - 1s 185us/step - loss: 14.5946 - acc: 0.0894 - val_loss: 14.6926 - val_acc: 0.0766 Epoch 00003: val_loss improved from 14.98229 to 14.69264, saving model to saved_models/weights.best.VGG191.hdf5 Epoch 4/20 6680/6680 [==============================] - ETA: 1s - loss: 14.5505 - acc: 0.09 - ETA: 1s - loss: 14.5370 - acc: 0.09 - ETA: 1s - loss: 14.5017 - acc: 0.09 - ETA: 0s - loss: 14.4418 - acc: 0.09 - ETA: 0s - loss: 14.4151 - acc: 0.10 - ETA: 0s - loss: 14.2918 - acc: 0.10 - ETA: 0s - loss: 14.2527 - acc: 0.11 - ETA: 0s - loss: 14.2730 - acc: 0.11 - ETA: 0s - loss: 14.3060 - acc: 0.10 - ETA: 0s - loss: 14.3426 - acc: 0.10 - ETA: 0s - loss: 14.3255 - acc: 0.10 - ETA: 0s - loss: 14.2979 - acc: 0.10 - ETA: 0s - loss: 14.3344 - acc: 0.10 - ETA: 0s - loss: 14.3374 - acc: 0.10 - ETA: 0s - loss: 14.3653 - acc: 0.10 - ETA: 0s - loss: 14.3766 - acc: 0.10 - ETA: 0s - loss: 14.3588 - acc: 0.10 - ETA: 0s - loss: 14.3714 - acc: 0.10 - ETA: 0s - loss: 14.3602 - acc: 0.10 - ETA: 0s - loss: 14.3535 - acc: 0.10 - ETA: 0s - loss: 14.3566 - acc: 0.10 - ETA: 0s - loss: 14.3520 - acc: 0.10 - 1s 184us/step - loss: 14.3472 - acc: 0.1046 - val_loss: 14.4163 - val_acc: 0.0994 Epoch 00004: val_loss improved from 14.69264 to 14.41634, saving model to saved_models/weights.best.VGG191.hdf5 Epoch 5/20 6680/6680 [==============================] - ETA: 1s - loss: 13.7004 - acc: 0.15 - ETA: 1s - loss: 14.4231 - acc: 0.10 - ETA: 1s - loss: 14.3005 - acc: 0.10 - ETA: 0s - loss: 14.3214 - acc: 0.10 - ETA: 0s - loss: 14.2805 - acc: 0.10 - ETA: 0s - loss: 14.1836 - acc: 0.11 - ETA: 0s - loss: 14.1689 - acc: 0.11 - ETA: 0s - loss: 14.1826 - acc: 0.11 - ETA: 0s - loss: 14.1008 - acc: 0.12 - ETA: 0s - loss: 14.1073 - acc: 0.12 - ETA: 0s - loss: 14.1284 - acc: 0.11 - ETA: 0s - loss: 14.1126 - acc: 0.11 - ETA: 0s - loss: 14.1071 - acc: 0.11 - ETA: 0s - loss: 14.0718 - acc: 0.12 - ETA: 0s - loss: 14.0780 - acc: 0.12 - ETA: 0s - loss: 14.0923 - acc: 0.12 - ETA: 0s - loss: 14.0784 - acc: 0.12 - ETA: 0s - loss: 14.1062 - acc: 0.12 - ETA: 0s - loss: 14.1114 - acc: 0.11 - ETA: 0s - loss: 14.1235 - acc: 0.11 - ETA: 0s - loss: 14.1171 - acc: 0.11 - ETA: 0s - loss: 14.1154 - acc: 0.11 - 1s 186us/step - loss: 14.1217 - acc: 0.1190 - val_loss: 14.3060 - val_acc: 0.1054 Epoch 00005: val_loss improved from 14.41634 to 14.30599, saving model to saved_models/weights.best.VGG191.hdf5 Epoch 6/20 6680/6680 [==============================] - ETA: 1s - loss: 13.7216 - acc: 0.14 - ETA: 1s - loss: 13.9348 - acc: 0.13 - ETA: 1s - loss: 13.8436 - acc: 0.13 - ETA: 0s - loss: 13.9461 - acc: 0.13 - ETA: 0s - loss: 14.0101 - acc: 0.12 - ETA: 0s - loss: 14.0148 - acc: 0.12 - ETA: 0s - loss: 13.9909 - acc: 0.13 - ETA: 0s - loss: 13.9886 - acc: 0.13 - ETA: 0s - loss: 13.9883 - acc: 0.13 - ETA: 0s - loss: 14.0276 - acc: 0.12 - ETA: 0s - loss: 14.0821 - acc: 0.12 - ETA: 0s - loss: 14.1268 - acc: 0.12 - ETA: 0s - loss: 
14.0941 - acc: 0.12 - ETA: 0s - loss: 14.0888 - acc: 0.12 - ETA: 0s - loss: 14.0809 - acc: 0.12 - ETA: 0s - loss: 14.0524 - acc: 0.12 - ETA: 0s - loss: 14.0637 - acc: 0.12 - ETA: 0s - loss: 14.0548 - acc: 0.12 - ETA: 0s - loss: 14.0238 - acc: 0.12 - ETA: 0s - loss: 13.9996 - acc: 0.12 - ETA: 0s - loss: 13.9976 - acc: 0.12 - ETA: 0s - loss: 14.0194 - acc: 0.12 - 1s 189us/step - loss: 14.0187 - acc: 0.1280 - val_loss: 14.1358 - val_acc: 0.1174 Epoch 00006: val_loss improved from 14.30599 to 14.13584, saving model to saved_models/weights.best.VGG191.hdf5 Epoch 7/20 6680/6680 [==============================] - ETA: 1s - loss: 14.5522 - acc: 0.09 - ETA: 1s - loss: 13.9980 - acc: 0.12 - ETA: 1s - loss: 13.9625 - acc: 0.13 - ETA: 0s - loss: 14.0001 - acc: 0.13 - ETA: 0s - loss: 14.0348 - acc: 0.12 - ETA: 0s - loss: 14.0772 - acc: 0.12 - ETA: 0s - loss: 14.0053 - acc: 0.12 - ETA: 0s - loss: 14.0163 - acc: 0.12 - ETA: 0s - loss: 13.9164 - acc: 0.13 - ETA: 0s - loss: 13.8996 - acc: 0.13 - ETA: 0s - loss: 13.8919 - acc: 0.13 - ETA: 0s - loss: 13.8921 - acc: 0.13 - ETA: 0s - loss: 13.8965 - acc: 0.13 - ETA: 0s - loss: 13.9159 - acc: 0.13 - ETA: 0s - loss: 13.9489 - acc: 0.13 - ETA: 0s - loss: 13.9482 - acc: 0.13 - ETA: 0s - loss: 13.9620 - acc: 0.13 - ETA: 0s - loss: 13.9597 - acc: 0.13 - ETA: 0s - loss: 13.9395 - acc: 0.13 - ETA: 0s - loss: 13.9134 - acc: 0.13 - ETA: 0s - loss: 13.9012 - acc: 0.13 - ETA: 0s - loss: 13.9028 - acc: 0.13 - 1s 186us/step - loss: 13.9135 - acc: 0.1340 - val_loss: 14.0486 - val_acc: 0.1222 Epoch 00007: val_loss improved from 14.13584 to 14.04863, saving model to saved_models/weights.best.VGG191.hdf5 Epoch 8/20 6680/6680 [==============================] - ETA: 1s - loss: 14.6675 - acc: 0.09 - ETA: 1s - loss: 13.5043 - acc: 0.15 - ETA: 0s - loss: 13.7592 - acc: 0.14 - ETA: 0s - loss: 13.8208 - acc: 0.14 - ETA: 0s - loss: 13.8220 - acc: 0.14 - ETA: 0s - loss: 13.7816 - acc: 0.14 - ETA: 0s - loss: 13.8789 - acc: 0.13 - ETA: 0s - loss: 13.9191 - acc: 0.13 - ETA: 0s - loss: 13.8618 - acc: 0.13 - ETA: 0s - loss: 13.8767 - acc: 0.13 - ETA: 0s - loss: 13.8401 - acc: 0.13 - ETA: 0s - loss: 13.8065 - acc: 0.14 - ETA: 0s - loss: 13.7919 - acc: 0.14 - ETA: 0s - loss: 13.8481 - acc: 0.13 - ETA: 0s - loss: 13.8608 - acc: 0.13 - ETA: 0s - loss: 13.8690 - acc: 0.13 - ETA: 0s - loss: 13.8725 - acc: 0.13 - ETA: 0s - loss: 13.8773 - acc: 0.13 - ETA: 0s - loss: 13.8736 - acc: 0.13 - ETA: 0s - loss: 13.8849 - acc: 0.13 - ETA: 0s - loss: 13.8527 - acc: 0.13 - ETA: 0s - loss: 13.8501 - acc: 0.13 - 1s 186us/step - loss: 13.8521 - acc: 0.1374 - val_loss: 13.9849 - val_acc: 0.1281 Epoch 00008: val_loss improved from 14.04863 to 13.98488, saving model to saved_models/weights.best.VGG191.hdf5 Epoch 9/20 6680/6680 [==============================] - ETA: 1s - loss: 13.8616 - acc: 0.14 - ETA: 1s - loss: 14.2242 - acc: 0.11 - ETA: 1s - loss: 13.9552 - acc: 0.13 - ETA: 0s - loss: 14.0195 - acc: 0.12 - ETA: 0s - loss: 14.0703 - acc: 0.12 - ETA: 0s - loss: 14.1016 - acc: 0.12 - ETA: 0s - loss: 14.0104 - acc: 0.12 - ETA: 0s - loss: 13.9632 - acc: 0.13 - ETA: 0s - loss: 13.9029 - acc: 0.13 - ETA: 0s - loss: 13.9155 - acc: 0.13 - ETA: 0s - loss: 13.8767 - acc: 0.13 - ETA: 0s - loss: 13.8850 - acc: 0.13 - ETA: 0s - loss: 13.9105 - acc: 0.13 - ETA: 0s - loss: 13.9109 - acc: 0.13 - ETA: 0s - loss: 13.8812 - acc: 0.13 - ETA: 0s - loss: 13.9282 - acc: 0.13 - ETA: 0s - loss: 13.9106 - acc: 0.13 - ETA: 0s - loss: 13.8891 - acc: 0.13 - ETA: 0s - loss: 13.8792 - acc: 0.13 - ETA: 0s - loss: 13.8695 - acc: 0.13 - ETA: 0s 
- loss: 13.8297 - acc: 0.13 - ETA: 0s - loss: 13.8133 - acc: 0.14 - 1s 186us/step - loss: 13.8103 - acc: 0.1409 - val_loss: 14.1861 - val_acc: 0.1114 Epoch 00009: val_loss did not improve Epoch 10/20 6680/6680 [==============================] - ETA: 1s - loss: 13.0998 - acc: 0.17 - ETA: 1s - loss: 13.7790 - acc: 0.13 - ETA: 1s - loss: 13.4941 - acc: 0.15 - ETA: 0s - loss: 13.5721 - acc: 0.15 - ETA: 0s - loss: 13.6931 - acc: 0.14 - ETA: 0s - loss: 13.7326 - acc: 0.14 - ETA: 0s - loss: 13.7285 - acc: 0.14 - ETA: 0s - loss: 13.7699 - acc: 0.14 - ETA: 0s - loss: 13.7970 - acc: 0.13 - ETA: 0s - loss: 13.8065 - acc: 0.13 - ETA: 0s - loss: 13.7354 - acc: 0.14 - ETA: 0s - loss: 13.7471 - acc: 0.14 - ETA: 0s - loss: 13.6999 - acc: 0.14 - ETA: 0s - loss: 13.6666 - acc: 0.14 - ETA: 0s - loss: 13.6851 - acc: 0.14 - ETA: 0s - loss: 13.7002 - acc: 0.14 - ETA: 0s - loss: 13.6834 - acc: 0.14 - ETA: 0s - loss: 13.6883 - acc: 0.14 - ETA: 0s - loss: 13.6968 - acc: 0.14 - ETA: 0s - loss: 13.7032 - acc: 0.14 - ETA: 0s - loss: 13.6637 - acc: 0.14 - ETA: 0s - loss: 13.6706 - acc: 0.14 - 1s 189us/step - loss: 13.6650 - acc: 0.1481 - val_loss: 13.8560 - val_acc: 0.1329 Epoch 00010: val_loss improved from 13.98488 to 13.85605, saving model to saved_models/weights.best.VGG191.hdf5 Epoch 11/20 6680/6680 [==============================] - ETA: 1s - loss: 13.5392 - acc: 0.16 - ETA: 1s - loss: 13.3141 - acc: 0.17 - ETA: 1s - loss: 13.2902 - acc: 0.17 - ETA: 1s - loss: 13.4911 - acc: 0.16 - ETA: 0s - loss: 13.5890 - acc: 0.15 - ETA: 0s - loss: 13.5722 - acc: 0.15 - ETA: 0s - loss: 13.5500 - acc: 0.15 - ETA: 0s - loss: 13.5705 - acc: 0.15 - ETA: 0s - loss: 13.5952 - acc: 0.15 - ETA: 0s - loss: 13.6056 - acc: 0.15 - ETA: 0s - loss: 13.5907 - acc: 0.15 - ETA: 0s - loss: 13.5786 - acc: 0.15 - ETA: 0s - loss: 13.5719 - acc: 0.15 - ETA: 0s - loss: 13.5462 - acc: 0.15 - ETA: 0s - loss: 13.5207 - acc: 0.15 - ETA: 0s - loss: 13.5164 - acc: 0.15 - ETA: 0s - loss: 13.5533 - acc: 0.15 - ETA: 0s - loss: 13.5541 - acc: 0.15 - ETA: 0s - loss: 13.5943 - acc: 0.15 - ETA: 0s - loss: 13.5928 - acc: 0.15 - ETA: 0s - loss: 13.6192 - acc: 0.15 - ETA: 0s - loss: 13.6281 - acc: 0.15 - 1s 188us/step - loss: 13.6094 - acc: 0.1537 - val_loss: 13.7689 - val_acc: 0.1365 Epoch 00011: val_loss improved from 13.85605 to 13.76886, saving model to saved_models/weights.best.VGG191.hdf5 Epoch 12/20 6680/6680 [==============================] - ETA: 1s - loss: 13.3780 - acc: 0.17 - ETA: 1s - loss: 13.0960 - acc: 0.18 - ETA: 1s - loss: 13.1824 - acc: 0.18 - ETA: 0s - loss: 13.3055 - acc: 0.17 - ETA: 0s - loss: 13.2975 - acc: 0.17 - ETA: 0s - loss: 13.2785 - acc: 0.17 - ETA: 0s - loss: 13.3475 - acc: 0.16 - ETA: 0s - loss: 13.4323 - acc: 0.16 - ETA: 0s - loss: 13.4538 - acc: 0.16 - ETA: 0s - loss: 13.5148 - acc: 0.15 - ETA: 0s - loss: 13.5379 - acc: 0.15 - ETA: 0s - loss: 13.5667 - acc: 0.15 - ETA: 0s - loss: 13.5340 - acc: 0.15 - ETA: 0s - loss: 13.5540 - acc: 0.15 - ETA: 0s - loss: 13.5605 - acc: 0.15 - ETA: 0s - loss: 13.5486 - acc: 0.15 - ETA: 0s - loss: 13.5152 - acc: 0.16 - ETA: 0s - loss: 13.5243 - acc: 0.15 - ETA: 0s - loss: 13.5398 - acc: 0.15 - ETA: 0s - loss: 13.5502 - acc: 0.15 - ETA: 0s - loss: 13.5734 - acc: 0.15 - ETA: 0s - loss: 13.5839 - acc: 0.15 - 1s 186us/step - loss: 13.5903 - acc: 0.1551 - val_loss: 13.8004 - val_acc: 0.1353 Epoch 00012: val_loss did not improve Epoch 13/20 6680/6680 [==============================] - ETA: 1s - loss: 13.7004 - acc: 0.15 - ETA: 1s - loss: 13.5795 - acc: 0.15 - ETA: 1s - loss: 13.8848 - acc: 0.13 - ETA: 0s 
- loss: 13.8991 - acc: 0.13 - ETA: 0s - loss: 13.9280 - acc: 0.13 - ETA: 0s - loss: 13.8853 - acc: 0.13 - ETA: 0s - loss: 13.7883 - acc: 0.14 - ETA: 0s - loss: 13.8317 - acc: 0.14 - ETA: 0s - loss: 13.7852 - acc: 0.14 - ETA: 0s - loss: 13.7683 - acc: 0.14 - ETA: 0s - loss: 13.7513 - acc: 0.14 - ETA: 0s - loss: 13.7563 - acc: 0.14 - ETA: 0s - loss: 13.7257 - acc: 0.14 - ETA: 0s - loss: 13.7480 - acc: 0.14 - ETA: 0s - loss: 13.7746 - acc: 0.14 - ETA: 0s - loss: 13.7523 - acc: 0.14 - ETA: 0s - loss: 13.7578 - acc: 0.14 - ETA: 0s - loss: 13.7295 - acc: 0.14 - ETA: 0s - loss: 13.7210 - acc: 0.14 - ETA: 0s - loss: 13.6939 - acc: 0.14 - ETA: 0s - loss: 13.6335 - acc: 0.15 - ETA: 0s - loss: 13.6089 - acc: 0.15 - 1s 186us/step - loss: 13.5669 - acc: 0.1576 - val_loss: 13.9561 - val_acc: 0.1293 Epoch 00013: val_loss did not improve Epoch 14/20 6680/6680 [==============================] - ETA: 1s - loss: 13.3780 - acc: 0.17 - ETA: 1s - loss: 13.4186 - acc: 0.16 - ETA: 1s - loss: 13.6350 - acc: 0.15 - ETA: 0s - loss: 13.6785 - acc: 0.15 - ETA: 0s - loss: 13.5720 - acc: 0.15 - ETA: 0s - loss: 13.6263 - acc: 0.15 - ETA: 0s - loss: 13.5193 - acc: 0.16 - ETA: 0s - loss: 13.5293 - acc: 0.16 - ETA: 0s - loss: 13.5753 - acc: 0.15 - ETA: 0s - loss: 13.6175 - acc: 0.15 - ETA: 0s - loss: 13.5995 - acc: 0.15 - ETA: 0s - loss: 13.6132 - acc: 0.15 - ETA: 0s - loss: 13.5680 - acc: 0.15 - ETA: 0s - loss: 13.5739 - acc: 0.15 - ETA: 0s - loss: 13.5790 - acc: 0.15 - ETA: 0s - loss: 13.5834 - acc: 0.15 - ETA: 0s - loss: 13.5675 - acc: 0.15 - ETA: 0s - loss: 13.5684 - acc: 0.15 - ETA: 0s - loss: 13.5260 - acc: 0.16 - ETA: 0s - loss: 13.5378 - acc: 0.15 - ETA: 0s - loss: 13.5458 - acc: 0.15 - ETA: 0s - loss: 13.5530 - acc: 0.15 - 1s 189us/step - loss: 13.5495 - acc: 0.1588 - val_loss: 13.8051 - val_acc: 0.1377 Epoch 00014: val_loss did not improve Epoch 15/20 6680/6680 [==============================] - ETA: 1s - loss: 13.0557 - acc: 0.19 - ETA: 1s - loss: 13.1765 - acc: 0.18 - ETA: 1s - loss: 13.2399 - acc: 0.17 - ETA: 0s - loss: 13.2491 - acc: 0.17 - ETA: 0s - loss: 13.3668 - acc: 0.17 - ETA: 0s - loss: 13.5200 - acc: 0.16 - ETA: 0s - loss: 13.4128 - acc: 0.16 - ETA: 0s - loss: 13.4301 - acc: 0.16 - ETA: 0s - loss: 13.4174 - acc: 0.16 - ETA: 0s - loss: 13.4456 - acc: 0.16 - ETA: 0s - loss: 13.4547 - acc: 0.16 - ETA: 0s - loss: 13.4669 - acc: 0.16 - ETA: 0s - loss: 13.4640 - acc: 0.16 - ETA: 0s - loss: 13.4858 - acc: 0.16 - ETA: 0s - loss: 13.5307 - acc: 0.16 - ETA: 0s - loss: 13.4892 - acc: 0.16 - ETA: 0s - loss: 13.4857 - acc: 0.16 - ETA: 0s - loss: 13.5358 - acc: 0.15 - ETA: 0s - loss: 13.5418 - acc: 0.15 - ETA: 0s - loss: 13.5361 - acc: 0.15 - ETA: 0s - loss: 13.5468 - acc: 0.15 - ETA: 0s - loss: 13.5641 - acc: 0.15 - 1s 186us/step - loss: 13.5482 - acc: 0.1590 - val_loss: 13.8399 - val_acc: 0.1365 Epoch 00015: val_loss did not improve Epoch 16/20 6680/6680 [==============================] - ETA: 1s - loss: 13.3780 - acc: 0.17 - ETA: 1s - loss: 13.6601 - acc: 0.15 - ETA: 1s - loss: 13.5392 - acc: 0.16 - ETA: 0s - loss: 13.5553 - acc: 0.15 - ETA: 0s - loss: 13.5764 - acc: 0.15 - ETA: 0s - loss: 13.5896 - acc: 0.15 - ETA: 0s - loss: 13.5816 - acc: 0.15 - ETA: 0s - loss: 13.5905 - acc: 0.15 - ETA: 0s - loss: 13.5521 - acc: 0.15 - ETA: 0s - loss: 13.5219 - acc: 0.16 - ETA: 0s - loss: 13.4976 - acc: 0.16 - ETA: 0s - loss: 13.4728 - acc: 0.16 - ETA: 0s - loss: 13.4913 - acc: 0.16 - ETA: 0s - loss: 13.5553 - acc: 0.15 - ETA: 0s - loss: 13.5093 - acc: 0.16 - ETA: 0s - loss: 13.5463 - acc: 0.15 - ETA: 0s - loss: 13.5393 - 
acc: 0.16 - ETA: 0s - loss: 13.5579 - acc: 0.15 - ETA: 0s - loss: 13.5363 - acc: 0.16 - ETA: 0s - loss: 13.5615 - acc: 0.15 - ETA: 0s - loss: 13.5393 - acc: 0.16 - ETA: 0s - loss: 13.5522 - acc: 0.15 - 1s 188us/step - loss: 13.5439 - acc: 0.1596 - val_loss: 13.8716 - val_acc: 0.1353 Epoch 00016: val_loss did not improve Epoch 17/20 6680/6680 [==============================] - ETA: 1s - loss: 12.1403 - acc: 0.24 - ETA: 1s - loss: 13.1896 - acc: 0.18 - ETA: 1s - loss: 13.3394 - acc: 0.17 - ETA: 0s - loss: 13.1253 - acc: 0.18 - ETA: 0s - loss: 13.2208 - acc: 0.17 - ETA: 0s - loss: 13.2805 - acc: 0.17 - ETA: 0s - loss: 13.2790 - acc: 0.17 - ETA: 0s - loss: 13.2925 - acc: 0.17 - ETA: 0s - loss: 13.3801 - acc: 0.16 - ETA: 0s - loss: 13.3856 - acc: 0.16 - ETA: 0s - loss: 13.3225 - acc: 0.17 - ETA: 0s - loss: 13.3559 - acc: 0.17 - ETA: 0s - loss: 13.3969 - acc: 0.16 - ETA: 0s - loss: 13.4196 - acc: 0.16 - ETA: 0s - loss: 13.4542 - acc: 0.16 - ETA: 0s - loss: 13.4317 - acc: 0.16 - ETA: 0s - loss: 13.4186 - acc: 0.16 - ETA: 0s - loss: 13.4813 - acc: 0.16 - ETA: 0s - loss: 13.4845 - acc: 0.16 - ETA: 0s - loss: 13.4929 - acc: 0.16 - ETA: 0s - loss: 13.5032 - acc: 0.16 - ETA: 0s - loss: 13.5200 - acc: 0.16 - 1s 188us/step - loss: 13.5444 - acc: 0.1596 - val_loss: 13.8612 - val_acc: 0.1341 Epoch 00017: val_loss did not improve Epoch 18/20 6680/6680 [==============================] - ETA: 1s - loss: 14.0276 - acc: 0.13 - ETA: 1s - loss: 13.3792 - acc: 0.17 - ETA: 1s - loss: 13.5859 - acc: 0.15 - ETA: 1s - loss: 13.5558 - acc: 0.15 - ETA: 0s - loss: 13.4528 - acc: 0.16 - ETA: 0s - loss: 13.4589 - acc: 0.16 - ETA: 0s - loss: 13.4886 - acc: 0.16 - ETA: 0s - loss: 13.4955 - acc: 0.16 - ETA: 0s - loss: 13.4878 - acc: 0.16 - ETA: 0s - loss: 13.5854 - acc: 0.15 - ETA: 0s - loss: 13.5758 - acc: 0.15 - ETA: 0s - loss: 13.5962 - acc: 0.15 - ETA: 0s - loss: 13.6003 - acc: 0.15 - ETA: 0s - loss: 13.6119 - acc: 0.15 - ETA: 0s - loss: 13.5880 - acc: 0.15 - ETA: 0s - loss: 13.6164 - acc: 0.15 - ETA: 0s - loss: 13.6117 - acc: 0.15 - ETA: 0s - loss: 13.5951 - acc: 0.15 - ETA: 0s - loss: 13.6067 - acc: 0.15 - ETA: 0s - loss: 13.6032 - acc: 0.15 - ETA: 0s - loss: 13.5552 - acc: 0.15 - ETA: 0s - loss: 13.5242 - acc: 0.16 - 1s 189us/step - loss: 13.5437 - acc: 0.1597 - val_loss: 13.8156 - val_acc: 0.1365 Epoch 00018: val_loss did not improve Epoch 19/20 6680/6680 [==============================] - ETA: 1s - loss: 14.0227 - acc: 0.13 - ETA: 1s - loss: 13.8213 - acc: 0.14 - ETA: 1s - loss: 13.7927 - acc: 0.14 - ETA: 0s - loss: 13.7738 - acc: 0.14 - ETA: 0s - loss: 13.7580 - acc: 0.14 - ETA: 0s - loss: 13.6436 - acc: 0.15 - ETA: 0s - loss: 13.6521 - acc: 0.15 - ETA: 0s - loss: 13.6744 - acc: 0.15 - ETA: 0s - loss: 13.6464 - acc: 0.15 - ETA: 0s - loss: 13.6464 - acc: 0.15 - ETA: 0s - loss: 13.6212 - acc: 0.15 - ETA: 0s - loss: 13.5820 - acc: 0.15 - ETA: 0s - loss: 13.5531 - acc: 0.15 - ETA: 0s - loss: 13.5325 - acc: 0.16 - ETA: 0s - loss: 13.5315 - acc: 0.16 - ETA: 0s - loss: 13.5663 - acc: 0.15 - ETA: 0s - loss: 13.5775 - acc: 0.15 - ETA: 0s - loss: 13.5723 - acc: 0.15 - ETA: 0s - loss: 13.5360 - acc: 0.16 - ETA: 0s - loss: 13.5553 - acc: 0.15 - ETA: 0s - loss: 13.5493 - acc: 0.15 - ETA: 0s - loss: 13.5315 - acc: 0.16 - 1s 186us/step - loss: 13.5457 - acc: 0.1594 - val_loss: 13.7987 - val_acc: 0.1401 Epoch 00019: val_loss did not improve Epoch 20/20 6680/6680 [==============================] - ETA: 1s - loss: 14.0227 - acc: 0.13 - ETA: 1s - loss: 14.0227 - acc: 0.13 - ETA: 1s - loss: 13.7464 - acc: 0.14 - ETA: 0s - loss: 
13.7971 - acc: 0.14 - ETA: 0s - loss: 13.6508 - acc: 0.15 - ETA: 0s - loss: 13.4888 - acc: 0.16 - ETA: 0s - loss: 13.4798 - acc: 0.16 - ETA: 0s - loss: 13.4220 - acc: 0.16 - ETA: 0s - loss: 13.4554 - acc: 0.16 - ETA: 0s - loss: 13.4701 - acc: 0.16 - ETA: 0s - loss: 13.5080 - acc: 0.16 - ETA: 0s - loss: 13.5345 - acc: 0.16 - ETA: 0s - loss: 13.5392 - acc: 0.16 - ETA: 0s - loss: 13.5513 - acc: 0.15 - ETA: 0s - loss: 13.5542 - acc: 0.15 - ETA: 0s - loss: 13.5322 - acc: 0.16 - ETA: 0s - loss: 13.5425 - acc: 0.15 - ETA: 0s - loss: 13.5361 - acc: 0.16 - ETA: 0s - loss: 13.5392 - acc: 0.16 - ETA: 0s - loss: 13.5364 - acc: 0.16 - ETA: 0s - loss: 13.5418 - acc: 0.15 - ETA: 0s - loss: 13.5694 - acc: 0.15 - 1s 186us/step - loss: 13.5435 - acc: 0.1597 - val_loss: 13.7987 - val_acc: 0.1401 Epoch 00020: val_loss did not improve we are at VGG19_model2 Train on 6680 samples, validate on 835 samples Epoch 1/20 6680/6680 [==============================] - ETA: 35s - loss: 14.7386 - acc: 0.0000e+0 - ETA: 6s - loss: 14.2506 - acc: 0.0067 - ETA: 3s - loss: 13.7557 - acc: 0.02 - ETA: 2s - loss: 13.4725 - acc: 0.02 - ETA: 1s - loss: 13.2382 - acc: 0.03 - ETA: 1s - loss: 13.0374 - acc: 0.04 - ETA: 0s - loss: 12.8039 - acc: 0.05 - ETA: 0s - loss: 12.5015 - acc: 0.06 - ETA: 0s - loss: 12.2659 - acc: 0.07 - ETA: 0s - loss: 12.1448 - acc: 0.08 - ETA: 0s - loss: 11.9664 - acc: 0.09 - ETA: 0s - loss: 11.7658 - acc: 0.09 - ETA: 0s - loss: 11.6232 - acc: 0.10 - ETA: 0s - loss: 11.5005 - acc: 0.10 - 1s 205us/step - loss: 11.4801 - acc: 0.1108 - val_loss: 9.5627 - val_acc: 0.2000 Epoch 00001: val_loss improved from inf to 9.56269, saving model to saved_models/weights.best.VGG192.hdf5 Epoch 2/20 6680/6680 [==============================] - ETA: 0s - loss: 9.6379 - acc: 0.230 - ETA: 0s - loss: 8.9948 - acc: 0.258 - ETA: 0s - loss: 9.0001 - acc: 0.269 - ETA: 0s - loss: 8.9606 - acc: 0.274 - ETA: 0s - loss: 8.8428 - acc: 0.276 - ETA: 0s - loss: 8.9016 - acc: 0.272 - ETA: 0s - loss: 8.8547 - acc: 0.273 - ETA: 0s - loss: 8.7740 - acc: 0.278 - ETA: 0s - loss: 8.7519 - acc: 0.279 - ETA: 0s - loss: 8.6354 - acc: 0.287 - ETA: 0s - loss: 8.5724 - acc: 0.289 - ETA: 0s - loss: 8.5047 - acc: 0.295 - ETA: 0s - loss: 8.4511 - acc: 0.297 - ETA: 0s - loss: 8.4351 - acc: 0.299 - 1s 117us/step - loss: 8.4386 - acc: 0.2996 - val_loss: 7.9648 - val_acc: 0.3174 Epoch 00002: val_loss improved from 9.56269 to 7.96479, saving model to saved_models/weights.best.VGG192.hdf5 Epoch 3/20 6680/6680 [==============================] - ETA: 0s - loss: 7.0667 - acc: 0.450 - ETA: 0s - loss: 7.7731 - acc: 0.351 - ETA: 0s - loss: 7.6993 - acc: 0.362 - ETA: 0s - loss: 7.6646 - acc: 0.372 - ETA: 0s - loss: 7.5628 - acc: 0.380 - ETA: 0s - loss: 7.5396 - acc: 0.384 - ETA: 0s - loss: 7.4289 - acc: 0.394 - ETA: 0s - loss: 7.4223 - acc: 0.392 - ETA: 0s - loss: 7.4084 - acc: 0.393 - ETA: 0s - loss: 7.4083 - acc: 0.398 - ETA: 0s - loss: 7.3185 - acc: 0.403 - ETA: 0s - loss: 7.3187 - acc: 0.403 - ETA: 0s - loss: 7.2714 - acc: 0.406 - ETA: 0s - loss: 7.2553 - acc: 0.408 - 1s 116us/step - loss: 7.2556 - acc: 0.4084 - val_loss: 7.4639 - val_acc: 0.3677 Epoch 00003: val_loss improved from 7.96479 to 7.46385, saving model to saved_models/weights.best.VGG192.hdf5 Epoch 4/20 6680/6680 [==============================] - ETA: 0s - loss: 7.1467 - acc: 0.490 - ETA: 0s - loss: 6.8941 - acc: 0.475 - ETA: 0s - loss: 6.8766 - acc: 0.466 - ETA: 0s - loss: 6.8200 - acc: 0.465 - ETA: 0s - loss: 6.8500 - acc: 0.465 - ETA: 0s - loss: 6.7161 - acc: 0.470 - ETA: 0s - loss: 6.6532 - acc: 0.470 
- ETA: 0s - loss: 6.6142 - acc: 0.469 - ETA: 0s - loss: 6.6465 - acc: 0.469 - ETA: 0s - loss: 6.6494 - acc: 0.470 - ETA: 0s - loss: 6.6418 - acc: 0.472 - ETA: 0s - loss: 6.6482 - acc: 0.472 - ETA: 0s - loss: 6.5883 - acc: 0.477 - ETA: 0s - loss: 6.5882 - acc: 0.477 - 1s 116us/step - loss: 6.5939 - acc: 0.4763 - val_loss: 6.9985 - val_acc: 0.4168 Epoch 00004: val_loss improved from 7.46385 to 6.99850, saving model to saved_models/weights.best.VGG192.hdf5 Epoch 5/20 6680/6680 [==============================] - ETA: 0s - loss: 5.9202 - acc: 0.510 - ETA: 0s - loss: 6.4614 - acc: 0.516 - ETA: 0s - loss: 6.4809 - acc: 0.519 - ETA: 0s - loss: 6.3761 - acc: 0.524 - ETA: 0s - loss: 6.4009 - acc: 0.527 - ETA: 0s - loss: 6.2697 - acc: 0.532 - ETA: 0s - loss: 6.3184 - acc: 0.528 - ETA: 0s - loss: 6.2679 - acc: 0.530 - ETA: 0s - loss: 6.2949 - acc: 0.529 - ETA: 0s - loss: 6.3128 - acc: 0.530 - ETA: 0s - loss: 6.2935 - acc: 0.528 - ETA: 0s - loss: 6.2769 - acc: 0.530 - ETA: 0s - loss: 6.3012 - acc: 0.530 - ETA: 0s - loss: 6.2743 - acc: 0.532 - 1s 116us/step - loss: 6.2681 - acc: 0.5326 - val_loss: 6.9740 - val_acc: 0.4263 Epoch 00005: val_loss improved from 6.99850 to 6.97402, saving model to saved_models/weights.best.VGG192.hdf5 Epoch 6/20 6680/6680 [==============================] - ETA: 0s - loss: 7.1335 - acc: 0.510 - ETA: 0s - loss: 6.1064 - acc: 0.556 - ETA: 0s - loss: 6.0414 - acc: 0.563 - ETA: 0s - loss: 6.0145 - acc: 0.571 - ETA: 0s - loss: 6.0506 - acc: 0.570 - ETA: 0s - loss: 6.2038 - acc: 0.563 - ETA: 0s - loss: 6.1956 - acc: 0.560 - ETA: 0s - loss: 6.1952 - acc: 0.560 - ETA: 0s - loss: 6.2664 - acc: 0.556 - ETA: 0s - loss: 6.1727 - acc: 0.562 - ETA: 0s - loss: 6.1483 - acc: 0.564 - ETA: 0s - loss: 6.1450 - acc: 0.563 - ETA: 0s - loss: 6.0976 - acc: 0.566 - ETA: 0s - loss: 6.1013 - acc: 0.566 - 1s 119us/step - loss: 6.0927 - acc: 0.5675 - val_loss: 6.9525 - val_acc: 0.4359 Epoch 00006: val_loss improved from 6.97402 to 6.95248, saving model to saved_models/weights.best.VGG192.hdf5 Epoch 7/20 6680/6680 [==============================] - ETA: 0s - loss: 7.0955 - acc: 0.510 - ETA: 0s - loss: 6.1615 - acc: 0.575 - ETA: 0s - loss: 5.8993 - acc: 0.596 - ETA: 0s - loss: 5.9824 - acc: 0.583 - ETA: 0s - loss: 6.0100 - acc: 0.583 - ETA: 0s - loss: 6.1076 - acc: 0.576 - ETA: 0s - loss: 6.0783 - acc: 0.577 - ETA: 0s - loss: 6.1030 - acc: 0.574 - ETA: 0s - loss: 6.0272 - acc: 0.579 - ETA: 0s - loss: 6.0268 - acc: 0.580 - ETA: 0s - loss: 6.0463 - acc: 0.580 - ETA: 0s - loss: 6.0308 - acc: 0.582 - ETA: 0s - loss: 6.0336 - acc: 0.583 - ETA: 0s - loss: 6.0143 - acc: 0.584 - 1s 117us/step - loss: 6.0159 - acc: 0.5847 - val_loss: 6.8184 - val_acc: 0.4491 Epoch 00007: val_loss improved from 6.95248 to 6.81841, saving model to saved_models/weights.best.VGG192.hdf5 Epoch 8/20 6680/6680 [==============================] - ETA: 0s - loss: 6.6313 - acc: 0.590 - ETA: 0s - loss: 5.9796 - acc: 0.606 - ETA: 0s - loss: 6.1093 - acc: 0.591 - ETA: 0s - loss: 5.9723 - acc: 0.603 - ETA: 0s - loss: 5.9910 - acc: 0.605 - ETA: 0s - loss: 5.9524 - acc: 0.608 - ETA: 0s - loss: 5.8604 - acc: 0.612 - ETA: 0s - loss: 5.9485 - acc: 0.606 - ETA: 0s - loss: 6.0090 - acc: 0.600 - ETA: 0s - loss: 5.9967 - acc: 0.600 - ETA: 0s - loss: 5.9660 - acc: 0.603 - ETA: 0s - loss: 5.9496 - acc: 0.604 - ETA: 0s - loss: 5.9669 - acc: 0.603 - ETA: 0s - loss: 5.9439 - acc: 0.603 - 1s 117us/step - loss: 5.9502 - acc: 0.6028 - val_loss: 6.7264 - val_acc: 0.4647 Epoch 00008: val_loss improved from 6.81841 to 6.72637, saving model to 
saved_models/weights.best.VGG192.hdf5 Epoch 9/20 6680/6680 [==============================] - ETA: 0s - loss: 6.8322 - acc: 0.560 - ETA: 0s - loss: 5.5039 - acc: 0.645 - ETA: 0s - loss: 5.2608 - acc: 0.661 - ETA: 0s - loss: 5.3895 - acc: 0.653 - ETA: 0s - loss: 5.6299 - acc: 0.639 - ETA: 0s - loss: 5.6689 - acc: 0.631 - ETA: 0s - loss: 5.7020 - acc: 0.628 - ETA: 0s - loss: 5.7600 - acc: 0.626 - ETA: 0s - loss: 5.8421 - acc: 0.622 - ETA: 0s - loss: 5.8649 - acc: 0.620 - ETA: 0s - loss: 5.9302 - acc: 0.616 - ETA: 0s - loss: 5.9440 - acc: 0.615 - ETA: 0s - loss: 5.9374 - acc: 0.615 - ETA: 0s - loss: 5.9436 - acc: 0.614 - 1s 116us/step - loss: 5.9141 - acc: 0.6165 - val_loss: 6.7803 - val_acc: 0.4623 Epoch 00009: val_loss did not improve Epoch 10/20 6680/6680 [==============================] - ETA: 0s - loss: 5.4971 - acc: 0.660 - ETA: 0s - loss: 5.8801 - acc: 0.633 - ETA: 0s - loss: 5.8580 - acc: 0.631 - ETA: 0s - loss: 5.7944 - acc: 0.636 - ETA: 0s - loss: 5.9248 - acc: 0.628 - ETA: 0s - loss: 5.8921 - acc: 0.628 - ETA: 0s - loss: 5.8950 - acc: 0.628 - ETA: 0s - loss: 5.8556 - acc: 0.630 - ETA: 0s - loss: 5.8478 - acc: 0.630 - ETA: 0s - loss: 5.8443 - acc: 0.630 - ETA: 0s - loss: 5.8542 - acc: 0.627 - ETA: 0s - loss: 5.8647 - acc: 0.626 - ETA: 0s - loss: 5.9016 - acc: 0.624 - ETA: 0s - loss: 5.8923 - acc: 0.624 - 1s 115us/step - loss: 5.8853 - acc: 0.6244 - val_loss: 6.7352 - val_acc: 0.4731 Epoch 00010: val_loss did not improve Epoch 11/20 6680/6680 [==============================] - ETA: 0s - loss: 4.8786 - acc: 0.680 - ETA: 0s - loss: 5.7654 - acc: 0.636 - ETA: 0s - loss: 5.8643 - acc: 0.630 - ETA: 0s - loss: 5.9833 - acc: 0.625 - ETA: 0s - loss: 5.8491 - acc: 0.631 - ETA: 0s - loss: 5.7665 - acc: 0.635 - ETA: 0s - loss: 5.8800 - acc: 0.628 - ETA: 0s - loss: 5.8918 - acc: 0.627 - ETA: 0s - loss: 5.8773 - acc: 0.628 - ETA: 0s - loss: 5.8584 - acc: 0.629 - ETA: 0s - loss: 5.8561 - acc: 0.629 - ETA: 0s - loss: 5.8663 - acc: 0.627 - ETA: 0s - loss: 5.8835 - acc: 0.626 - ETA: 0s - loss: 5.8645 - acc: 0.627 - 1s 115us/step - loss: 5.8667 - acc: 0.6275 - val_loss: 6.7291 - val_acc: 0.4802 Epoch 00011: val_loss did not improve Epoch 12/20 6680/6680 [==============================] - ETA: 0s - loss: 5.0177 - acc: 0.690 - ETA: 0s - loss: 5.8711 - acc: 0.635 - ETA: 0s - loss: 5.9480 - acc: 0.630 - ETA: 0s - loss: 5.9956 - acc: 0.627 - ETA: 0s - loss: 5.7771 - acc: 0.639 - ETA: 0s - loss: 5.7891 - acc: 0.636 - ETA: 0s - loss: 5.7568 - acc: 0.639 - ETA: 0s - loss: 5.7749 - acc: 0.638 - ETA: 0s - loss: 5.7807 - acc: 0.635 - ETA: 0s - loss: 5.7910 - acc: 0.634 - ETA: 0s - loss: 5.8345 - acc: 0.631 - ETA: 0s - loss: 5.8068 - acc: 0.633 - ETA: 0s - loss: 5.8382 - acc: 0.631 - ETA: 0s - loss: 5.8424 - acc: 0.631 - 1s 115us/step - loss: 5.8572 - acc: 0.6304 - val_loss: 6.6819 - val_acc: 0.4838 Epoch 00012: val_loss improved from 6.72637 to 6.68194, saving model to saved_models/weights.best.VGG192.hdf5 Epoch 13/20 6680/6680 [==============================] - ETA: 0s - loss: 5.9671 - acc: 0.630 - ETA: 0s - loss: 6.1842 - acc: 0.616 - ETA: 0s - loss: 6.1036 - acc: 0.620 - ETA: 0s - loss: 6.0731 - acc: 0.622 - ETA: 0s - loss: 5.9657 - acc: 0.628 - ETA: 0s - loss: 5.8664 - acc: 0.633 - ETA: 0s - loss: 5.7962 - acc: 0.637 - ETA: 0s - loss: 5.7944 - acc: 0.638 - ETA: 0s - loss: 5.7106 - acc: 0.643 - ETA: 0s - loss: 5.7157 - acc: 0.642 - ETA: 0s - loss: 5.7952 - acc: 0.637 - ETA: 0s - loss: 5.7907 - acc: 0.638 - ETA: 0s - loss: 5.8377 - acc: 0.634 - ETA: 0s - loss: 5.8469 - acc: 0.633 - 1s 115us/step - loss: 5.8497 
- acc: 0.6334 - val_loss: 6.6984 - val_acc: 0.4790 Epoch 00013: val_loss did not improve Epoch 14/20 6680/6680 [==============================] - ETA: 0s - loss: 4.9970 - acc: 0.690 - ETA: 0s - loss: 5.6233 - acc: 0.646 - ETA: 0s - loss: 5.7940 - acc: 0.638 - ETA: 0s - loss: 5.8543 - acc: 0.633 - ETA: 0s - loss: 5.9185 - acc: 0.630 - ETA: 0s - loss: 5.9823 - acc: 0.625 - ETA: 0s - loss: 5.9405 - acc: 0.626 - ETA: 0s - loss: 5.8965 - acc: 0.629 - ETA: 0s - loss: 5.8907 - acc: 0.628 - ETA: 0s - loss: 5.8457 - acc: 0.630 - ETA: 0s - loss: 5.7888 - acc: 0.633 - ETA: 0s - loss: 5.7814 - acc: 0.633 - ETA: 0s - loss: 5.7861 - acc: 0.633 - ETA: 0s - loss: 5.8243 - acc: 0.630 - 1s 116us/step - loss: 5.8073 - acc: 0.6311 - val_loss: 6.6643 - val_acc: 0.4659 Epoch 00014: val_loss improved from 6.68194 to 6.66431, saving model to saved_models/weights.best.VGG192.hdf5 Epoch 15/20 6680/6680 [==============================] - ETA: 0s - loss: 7.2003 - acc: 0.540 - ETA: 0s - loss: 5.9375 - acc: 0.615 - ETA: 0s - loss: 5.9754 - acc: 0.615 - ETA: 0s - loss: 6.0494 - acc: 0.610 - ETA: 0s - loss: 6.0719 - acc: 0.608 - ETA: 0s - loss: 5.9100 - acc: 0.619 - ETA: 0s - loss: 5.8366 - acc: 0.624 - ETA: 0s - loss: 5.8026 - acc: 0.627 - ETA: 0s - loss: 5.8018 - acc: 0.627 - ETA: 0s - loss: 5.7597 - acc: 0.628 - ETA: 0s - loss: 5.7040 - acc: 0.632 - ETA: 0s - loss: 5.7399 - acc: 0.630 - ETA: 0s - loss: 5.6912 - acc: 0.633 - ETA: 0s - loss: 5.6538 - acc: 0.636 - 1s 118us/step - loss: 5.6634 - acc: 0.6355 - val_loss: 6.5693 - val_acc: 0.4934 Epoch 00015: val_loss improved from 6.66431 to 6.56929, saving model to saved_models/weights.best.VGG192.hdf5 Epoch 16/20 6680/6680 [==============================] - ETA: 0s - loss: 7.1751 - acc: 0.550 - ETA: 0s - loss: 6.1454 - acc: 0.611 - ETA: 0s - loss: 6.2138 - acc: 0.610 - ETA: 0s - loss: 6.1647 - acc: 0.612 - ETA: 0s - loss: 5.9249 - acc: 0.626 - ETA: 0s - loss: 5.7995 - acc: 0.633 - ETA: 0s - loss: 5.8042 - acc: 0.632 - ETA: 0s - loss: 5.7280 - acc: 0.637 - ETA: 0s - loss: 5.6924 - acc: 0.639 - ETA: 0s - loss: 5.6716 - acc: 0.640 - ETA: 0s - loss: 5.6623 - acc: 0.640 - ETA: 0s - loss: 5.6279 - acc: 0.642 - ETA: 0s - loss: 5.6533 - acc: 0.640 - ETA: 0s - loss: 5.6262 - acc: 0.642 - 1s 117us/step - loss: 5.6195 - acc: 0.6424 - val_loss: 6.5163 - val_acc: 0.4922 Epoch 00016: val_loss improved from 6.56929 to 6.51628, saving model to saved_models/weights.best.VGG192.hdf5 Epoch 17/20 6680/6680 [==============================] - ETA: 0s - loss: 6.6357 - acc: 0.580 - ETA: 0s - loss: 6.0165 - acc: 0.620 - ETA: 0s - loss: 5.8941 - acc: 0.630 - ETA: 0s - loss: 5.7908 - acc: 0.636 - ETA: 0s - loss: 5.7975 - acc: 0.634 - ETA: 0s - loss: 5.8461 - acc: 0.631 - ETA: 0s - loss: 5.7709 - acc: 0.637 - ETA: 0s - loss: 5.7140 - acc: 0.638 - ETA: 0s - loss: 5.6653 - acc: 0.641 - ETA: 0s - loss: 5.5836 - acc: 0.646 - ETA: 0s - loss: 5.5858 - acc: 0.646 - ETA: 0s - loss: 5.5923 - acc: 0.646 - ETA: 0s - loss: 5.6100 - acc: 0.645 - ETA: 0s - loss: 5.6038 - acc: 0.646 - 1s 118us/step - loss: 5.6020 - acc: 0.6466 - val_loss: 6.5784 - val_acc: 0.4886 Epoch 00017: val_loss did not improve Epoch 18/20 6680/6680 [==============================] - ETA: 0s - loss: 5.9746 - acc: 0.630 - ETA: 0s - loss: 5.5054 - acc: 0.656 - ETA: 0s - loss: 5.8163 - acc: 0.634 - ETA: 0s - loss: 5.7345 - acc: 0.636 - ETA: 0s - loss: 5.7039 - acc: 0.638 - ETA: 0s - loss: 5.6709 - acc: 0.640 - ETA: 0s - loss: 5.6302 - acc: 0.641 - ETA: 0s - loss: 5.6307 - acc: 0.641 - ETA: 0s - loss: 5.5939 - acc: 0.643 - ETA: 0s - loss: 5.6238 
Epoch 18/20 - 1s 115us/step - loss: 5.5112 - acc: 0.6481 - val_loss: 6.4285 - val_acc: 0.4934
Epoch 00018: val_loss improved from 6.51628 to 6.42850, saving model to saved_models/weights.best.VGG192.hdf5
Epoch 19/20 - 1s 115us/step - loss: 5.4027 - acc: 0.6528 - val_loss: 6.4260 - val_acc: 0.4886
Epoch 00019: val_loss improved from 6.42850 to 6.42601, saving model to saved_models/weights.best.VGG192.hdf5
Epoch 20/20 - 1s 118us/step - loss: 5.3412 - acc: 0.6623 - val_loss: 6.3252 - val_acc: 0.4994
Epoch 00020: val_loss improved from 6.42601 to 6.32522, saving model to saved_models/weights.best.VGG192.hdf5

we are at VGG19_model3
Train on 6680 samples, validate on 835 samples
Epoch 1/20 - 1s 204us/step - loss: 16.0109 - acc: 0.0064 - val_loss: 16.0023 - val_acc: 0.0072
Epoch 00001: val_loss improved from inf to 16.00228, saving model to saved_models/weights.best.VGG193.hdf5
Epochs 2-20/20 - 1s 115-118us/step - loss: 16.0047 - acc: 0.0070 - val_loss: 16.0023 - val_acc: 0.0072 (unchanged every epoch)
Epochs 00002-00020: val_loss did not improve

we are at VGG19_model4
Train on 6680 samples, validate on 835 samples
Epoch 1/20 - 2s 278us/step - loss: 15.6349 - acc: 0.0272 - val_loss: 15.3418 - val_acc: 0.0443
Epoch 00001: val_loss improved from inf to 15.34177, saving model to saved_models/weights.best.VGG194.hdf5
Epoch 2/20 - 1s 186us/step - loss: 15.2708 - acc: 0.0513 - val_loss: 15.2669 - val_acc: 0.0527
Epoch 00002: val_loss improved from 15.34177 to 15.26686, saving model to saved_models/weights.best.VGG194.hdf5
Epoch 3/20 - 1s 184us/step - loss: 15.2162 - acc: 0.0548 - val_loss: 15.1853 - val_acc: 0.0551
Epoch 00003: val_loss improved from 15.26686 to 15.18532, saving model to saved_models/weights.best.VGG194.hdf5
Epoch 4/20 - 1s 186us/step - loss: 15.1800 - acc: 0.0575 - val_loss: 15.1562 - val_acc: 0.0587
Epoch 00004: val_loss improved from 15.18532 to 15.15621, saving model to saved_models/weights.best.VGG194.hdf5
Epoch 5/20 - 1s 188us/step - loss: 15.1419 - acc: 0.0602 - val_loss: 15.1944 - val_acc: 0.0563
Epoch 00005: val_loss did not improve
Epoch 6/20 - 1s 186us/step - loss: 15.1314 - acc: 0.0612 - val_loss: 15.1718 - val_acc: 0.0587
Epoch 00006: val_loss did not improve
Epoch 7/20 - 1s 185us/step - loss: 15.1371 - acc: 0.0606 - val_loss: 15.1830 - val_acc: 0.0563
Epoch 00007: val_loss did not improve
Epoch 8/20 - 1s 185us/step - loss: 15.1312 - acc: 0.0611 - val_loss: 15.2205 - val_acc: 0.0551
Epoch 00008: val_loss did not improve
Epoch 9/20 - 1s 187us/step - loss: 15.1332 - acc: 0.0608 - val_loss: 15.1550 - val_acc: 0.0587
Epoch 00009: val_loss improved from 15.15621 to 15.15499, saving model to saved_models/weights.best.VGG194.hdf5
Epoch 10/20 - 1s 184us/step - loss: 15.1195 - acc: 0.0618 - val_loss: 15.1723 - val_acc: 0.0587
Epoch 00010: val_loss did not improve
Epoch 11/20 - 1s 185us/step - loss: 15.1187 - acc: 0.0620 - val_loss: 15.1530 - val_acc: 0.0599
Epoch 00011: val_loss improved from 15.15499 to 15.15296, saving model to saved_models/weights.best.VGG194.hdf5
Epoch 12/20 - 1s 193us/step - loss: 15.0839 - acc: 0.0635 - val_loss: 15.0741 - val_acc: 0.0623
Epoch 00012: val_loss improved from 15.15296 to 15.07411, saving model to saved_models/weights.best.VGG194.hdf5
Epoch 13/20 - 1s 186us/step - loss: 15.0584 - acc: 0.0656 - val_loss: 15.0264 - val_acc: 0.0671
Epoch 00013: val_loss improved from 15.07411 to 15.02643, saving model to saved_models/weights.best.VGG194.hdf5
Epoch 14/20 - 1s 185us/step - loss: 15.0542 - acc: 0.0647 - val_loss: 15.0216 - val_acc: 0.0671
Epoch 00014: val_loss improved from 15.02643 to 15.02161, saving model to saved_models/weights.best.VGG194.hdf5
Epoch 15/20 - 1s 184us/step - loss: 15.0183 - acc: 0.0671 - val_loss: 14.9834 - val_acc: 0.0683
Epoch 00015: val_loss improved from 15.02161 to 14.98336, saving model to saved_models/weights.best.VGG194.hdf5
Epoch 16/20 - 1s 186us/step - loss: 14.9357 - acc: 0.0731 - val_loss: 14.8903 - val_acc: 0.0754
Epoch 00016: val_loss improved from 14.98336 to 14.89030, saving model to saved_models/weights.best.VGG194.hdf5
Epoch 17/20 - 1s 186us/step - loss: 14.8950 - acc: 0.0754 - val_loss: 14.9842 - val_acc: 0.0695
Epoch 00017: val_loss did not improve
Epoch 18/20 - 1s 186us/step - loss: 14.8828 - acc: 0.0762 - val_loss: 14.9102 - val_acc: 0.0731
Epoch 00018: val_loss did not improve
Epoch 19/20 - 1s 185us/step - loss: 14.8807 - acc: 0.0762 - val_loss: 15.0148 - val_acc: 0.0671
Epoch 00019: val_loss did not improve
Epoch 20/20 - 1s 187us/step - loss: 14.8545 - acc: 0.0781 - val_loss: 14.9428 - val_acc: 0.0719
Epoch 00020: val_loss did not improve

we are at InceptionV3_model
Train on 6680 samples, validate on 835 samples
Epoch 1/20 - 8s 1ms/step - loss: 11.0913 - acc: 0.2819 - val_loss: 10.2596 - val_acc: 0.3485
Epoch 00001: val_loss improved from inf to 10.25960, saving model to saved_models/weights.best.InceptionV3.hdf5
Epoch 2/20 - 8s 1ms/step - loss: 10.1035 - acc: 0.3590 - val_loss: 9.8615 - val_acc: 0.3749
Epoch 00002: val_loss improved from 10.25960 to 9.86151, saving model to saved_models/weights.best.InceptionV3.hdf5
Epoch 3/20 - 7s 1ms/step - loss: 9.6721 - acc: 0.3904 - val_loss: 9.5869 - val_acc: 0.3928
Epoch 00003: val_loss improved from 9.86151 to 9.58689, saving model to saved_models/weights.best.InceptionV3.hdf5
Epoch 4/20 - 7s 1ms/step - loss: 9.4257 - acc: 0.4037 - val_loss: 9.5651 - val_acc: 0.3904
Epoch 00004: val_loss improved from 9.58689 to 9.56506, saving model to saved_models/weights.best.InceptionV3.hdf5
Epoch 5/20 - 7s 1ms/step - loss: 9.0769 - acc: 0.4274 - val_loss: 9.3617 - val_acc: 0.4024
Epoch 00005: val_loss improved from 9.56506 to 9.36171, saving model to saved_models/weights.best.InceptionV3.hdf5
Epoch 6/20 - 7s 1ms/step - loss: 8.9624 - acc: 0.4377 - val_loss: 9.0097 - val_acc: 0.4263
Epoch 00006: val_loss improved from 9.36171 to 9.00975, saving model to saved_models/weights.best.InceptionV3.hdf5
Epoch 7/20 - 7s 1ms/step - loss: 8.8102 - acc: 0.4488 - val_loss: 9.1950 - val_acc: 0.4216
Epoch 00007: val_loss did not improve
Epoch 8/20 - 7s 1ms/step - loss: 8.7877 - acc: 0.4510 - val_loss: 9.0629 - val_acc: 0.4323
Epoch 00008: val_loss did not improve
Epoch 9/20 - 7s 1ms/step - loss: 8.7828 - acc: 0.4510 - val_loss: 9.1524 - val_acc: 0.4240
Epoch 00009: val_loss did not improve
Epoch 10/20 - 7s 1ms/step - loss: 8.7375 - acc: 0.4546 - val_loss: 9.2922 - val_acc: 0.4156
Epoch 00010: val_loss did not improve
Epoch 11/20 - 7s 1ms/step - loss: 8.7330 - acc: 0.4557 - val_loss: 9.2270 - val_acc: 0.4204
Epoch 00011: val_loss did not improve
Epoch 12/20 - 7s 1ms/step - loss: 8.6858 - acc: 0.4594 - val_loss: 9.0487 - val_acc: 0.4335
Epoch 00012: val_loss did not improve
Epoch 13/20 - 7s 1ms/step - loss: 8.6624 - acc: 0.4612 - val_loss: 9.1198 - val_acc: 0.4275
Epoch 00013: val_loss did not improve
Epoch 14/20 - (remaining training output truncated)
loss: 8.6860 - acc: 0.460 - ETA: 2s - loss: 8.6677 - acc: 0.461 - ETA: 2s - loss: 8.6580 - acc: 0.461 - ETA: 2s - loss: 8.6555 - acc: 0.461 - ETA: 2s - loss: 8.6700 - acc: 0.461 - ETA: 1s - loss: 8.6871 - acc: 0.460 - ETA: 1s - loss: 8.7003 - acc: 0.459 - ETA: 1s - loss: 8.6720 - acc: 0.461 - ETA: 1s - loss: 8.6781 - acc: 0.460 - ETA: 1s - loss: 8.6635 - acc: 0.461 - ETA: 1s - loss: 8.6882 - acc: 0.459 - ETA: 1s - loss: 8.6551 - acc: 0.461 - ETA: 1s - loss: 8.6502 - acc: 0.462 - ETA: 1s - loss: 8.6766 - acc: 0.460 - ETA: 0s - loss: 8.6645 - acc: 0.461 - ETA: 0s - loss: 8.6761 - acc: 0.460 - ETA: 0s - loss: 8.6685 - acc: 0.460 - ETA: 0s - loss: 8.6623 - acc: 0.461 - ETA: 0s - loss: 8.6473 - acc: 0.461 - ETA: 0s - loss: 8.6523 - acc: 0.461 - ETA: 0s - loss: 8.6732 - acc: 0.460 - ETA: 0s - loss: 8.6690 - acc: 0.460 - ETA: 0s - loss: 8.6647 - acc: 0.460 - 7s 1ms/step - loss: 8.6622 - acc: 0.4609 - val_loss: 9.1780 - val_acc: 0.4240 Epoch 00014: val_loss did not improve Epoch 15/20 6680/6680 [==============================] - ETA: 6s - loss: 6.9308 - acc: 0.570 - ETA: 6s - loss: 8.1397 - acc: 0.495 - ETA: 6s - loss: 8.5963 - acc: 0.466 - ETA: 6s - loss: 8.5426 - acc: 0.470 - ETA: 6s - loss: 8.5104 - acc: 0.472 - ETA: 6s - loss: 8.3814 - acc: 0.480 - ETA: 6s - loss: 8.3354 - acc: 0.482 - ETA: 6s - loss: 8.5634 - acc: 0.468 - ETA: 6s - loss: 8.5253 - acc: 0.471 - ETA: 6s - loss: 8.4787 - acc: 0.474 - ETA: 5s - loss: 8.5724 - acc: 0.468 - ETA: 5s - loss: 8.6774 - acc: 0.461 - ETA: 5s - loss: 8.6794 - acc: 0.461 - ETA: 5s - loss: 8.6598 - acc: 0.462 - ETA: 5s - loss: 8.6305 - acc: 0.464 - ETA: 5s - loss: 8.6762 - acc: 0.460 - ETA: 5s - loss: 8.6873 - acc: 0.460 - ETA: 5s - loss: 8.6703 - acc: 0.461 - ETA: 5s - loss: 8.6721 - acc: 0.461 - ETA: 4s - loss: 8.6334 - acc: 0.463 - ETA: 4s - loss: 8.6293 - acc: 0.463 - ETA: 4s - loss: 8.6400 - acc: 0.463 - ETA: 4s - loss: 8.6147 - acc: 0.464 - ETA: 4s - loss: 8.6168 - acc: 0.464 - ETA: 4s - loss: 8.6138 - acc: 0.464 - ETA: 4s - loss: 8.6173 - acc: 0.464 - ETA: 4s - loss: 8.6384 - acc: 0.463 - ETA: 4s - loss: 8.6580 - acc: 0.462 - ETA: 3s - loss: 8.6540 - acc: 0.462 - ETA: 3s - loss: 8.6449 - acc: 0.463 - ETA: 3s - loss: 8.7196 - acc: 0.458 - ETA: 3s - loss: 8.7342 - acc: 0.457 - ETA: 3s - loss: 8.7333 - acc: 0.457 - ETA: 3s - loss: 8.6992 - acc: 0.459 - ETA: 3s - loss: 8.7198 - acc: 0.458 - ETA: 3s - loss: 8.7021 - acc: 0.459 - ETA: 3s - loss: 8.7065 - acc: 0.458 - ETA: 3s - loss: 8.7107 - acc: 0.458 - ETA: 2s - loss: 8.6857 - acc: 0.460 - ETA: 2s - loss: 8.6902 - acc: 0.460 - ETA: 2s - loss: 8.6905 - acc: 0.460 - ETA: 2s - loss: 8.6947 - acc: 0.459 - ETA: 2s - loss: 8.7099 - acc: 0.458 - ETA: 2s - loss: 8.6951 - acc: 0.459 - ETA: 2s - loss: 8.6917 - acc: 0.460 - ETA: 2s - loss: 8.7060 - acc: 0.459 - ETA: 2s - loss: 8.6957 - acc: 0.459 - ETA: 2s - loss: 8.6925 - acc: 0.460 - ETA: 1s - loss: 8.6763 - acc: 0.461 - ETA: 1s - loss: 8.6752 - acc: 0.461 - ETA: 1s - loss: 8.6600 - acc: 0.462 - ETA: 1s - loss: 8.6577 - acc: 0.462 - ETA: 1s - loss: 8.6428 - acc: 0.463 - ETA: 1s - loss: 8.6161 - acc: 0.464 - ETA: 1s - loss: 8.5942 - acc: 0.465 - ETA: 1s - loss: 8.6279 - acc: 0.463 - ETA: 1s - loss: 8.6377 - acc: 0.463 - ETA: 0s - loss: 8.6055 - acc: 0.465 - ETA: 0s - loss: 8.5826 - acc: 0.466 - ETA: 0s - loss: 8.6005 - acc: 0.465 - ETA: 0s - loss: 8.5867 - acc: 0.466 - ETA: 0s - loss: 8.6042 - acc: 0.465 - ETA: 0s - loss: 8.6109 - acc: 0.464 - ETA: 0s - loss: 8.6023 - acc: 0.465 - ETA: 0s - loss: 8.5961 - acc: 0.465 - ETA: 0s - loss: 8.6173 - acc: 0.464 - 7s 
1ms/step - loss: 8.6009 - acc: 0.4654 - val_loss: 8.9998 - val_acc: 0.4359 Epoch 00015: val_loss improved from 9.00975 to 8.99983, saving model to saved_models/weights.best.InceptionV3.hdf5 Epoch 16/20 6680/6680 [==============================] - ETA: 6s - loss: 7.2531 - acc: 0.550 - ETA: 6s - loss: 8.5426 - acc: 0.470 - ETA: 6s - loss: 8.4543 - acc: 0.473 - ETA: 6s - loss: 8.3958 - acc: 0.477 - ETA: 6s - loss: 8.4369 - acc: 0.474 - ETA: 6s - loss: 8.6157 - acc: 0.463 - ETA: 6s - loss: 8.6973 - acc: 0.458 - ETA: 6s - loss: 8.6981 - acc: 0.458 - ETA: 6s - loss: 8.5555 - acc: 0.467 - ETA: 5s - loss: 8.5390 - acc: 0.468 - ETA: 5s - loss: 8.5686 - acc: 0.466 - ETA: 5s - loss: 8.5127 - acc: 0.470 - ETA: 5s - loss: 8.5522 - acc: 0.467 - ETA: 5s - loss: 8.6040 - acc: 0.464 - ETA: 5s - loss: 8.5676 - acc: 0.466 - ETA: 5s - loss: 8.6366 - acc: 0.462 - ETA: 5s - loss: 8.5519 - acc: 0.467 - ETA: 5s - loss: 8.6140 - acc: 0.463 - ETA: 5s - loss: 8.5848 - acc: 0.465 - ETA: 4s - loss: 8.5576 - acc: 0.467 - ETA: 4s - loss: 8.5569 - acc: 0.467 - ETA: 4s - loss: 8.6002 - acc: 0.465 - ETA: 4s - loss: 8.6117 - acc: 0.464 - ETA: 4s - loss: 8.6139 - acc: 0.464 - ETA: 4s - loss: 8.6626 - acc: 0.461 - ETA: 4s - loss: 8.6394 - acc: 0.462 - ETA: 4s - loss: 8.6537 - acc: 0.461 - ETA: 4s - loss: 8.6308 - acc: 0.463 - ETA: 4s - loss: 8.6151 - acc: 0.464 - ETA: 3s - loss: 8.6396 - acc: 0.462 - ETA: 3s - loss: 8.6313 - acc: 0.463 - ETA: 3s - loss: 8.6285 - acc: 0.463 - ETA: 3s - loss: 8.6259 - acc: 0.463 - ETA: 3s - loss: 8.6426 - acc: 0.462 - ETA: 3s - loss: 8.6490 - acc: 0.462 - ETA: 3s - loss: 8.6729 - acc: 0.460 - ETA: 3s - loss: 8.6827 - acc: 0.460 - ETA: 3s - loss: 8.6578 - acc: 0.461 - ETA: 2s - loss: 8.6447 - acc: 0.462 - ETA: 2s - loss: 8.6542 - acc: 0.461 - ETA: 2s - loss: 8.6633 - acc: 0.461 - ETA: 2s - loss: 8.6716 - acc: 0.460 - ETA: 2s - loss: 8.6724 - acc: 0.460 - ETA: 2s - loss: 8.6731 - acc: 0.460 - ETA: 2s - loss: 8.6581 - acc: 0.461 - ETA: 2s - loss: 8.6906 - acc: 0.459 - ETA: 2s - loss: 8.6602 - acc: 0.461 - ETA: 2s - loss: 8.6880 - acc: 0.459 - ETA: 1s - loss: 8.6643 - acc: 0.461 - ETA: 1s - loss: 8.6687 - acc: 0.460 - ETA: 1s - loss: 8.6530 - acc: 0.461 - ETA: 1s - loss: 8.6583 - acc: 0.461 - ETA: 1s - loss: 8.6401 - acc: 0.462 - ETA: 1s - loss: 8.6353 - acc: 0.462 - ETA: 1s - loss: 8.6160 - acc: 0.463 - ETA: 1s - loss: 8.6003 - acc: 0.464 - ETA: 1s - loss: 8.5854 - acc: 0.465 - ETA: 0s - loss: 8.5876 - acc: 0.465 - ETA: 0s - loss: 8.5897 - acc: 0.465 - ETA: 0s - loss: 8.5942 - acc: 0.464 - ETA: 0s - loss: 8.6013 - acc: 0.464 - ETA: 0s - loss: 8.5952 - acc: 0.464 - ETA: 0s - loss: 8.6158 - acc: 0.463 - ETA: 0s - loss: 8.6251 - acc: 0.462 - ETA: 0s - loss: 8.5941 - acc: 0.464 - ETA: 0s - loss: 8.5908 - acc: 0.465 - 7s 1ms/step - loss: 8.6038 - acc: 0.4642 - val_loss: 9.0935 - val_acc: 0.4263 Epoch 00016: val_loss did not improve Epoch 17/20 6680/6680 [==============================] - ETA: 6s - loss: 8.5426 - acc: 0.470 - ETA: 6s - loss: 8.7844 - acc: 0.455 - ETA: 6s - loss: 8.6983 - acc: 0.460 - ETA: 6s - loss: 8.2161 - acc: 0.490 - ETA: 6s - loss: 8.3459 - acc: 0.482 - ETA: 6s - loss: 8.3787 - acc: 0.480 - ETA: 6s - loss: 8.3560 - acc: 0.481 - ETA: 6s - loss: 8.3391 - acc: 0.482 - ETA: 6s - loss: 8.3000 - acc: 0.484 - ETA: 5s - loss: 8.4532 - acc: 0.475 - ETA: 5s - loss: 8.3881 - acc: 0.479 - ETA: 5s - loss: 8.3338 - acc: 0.482 - ETA: 5s - loss: 8.3251 - acc: 0.483 - ETA: 5s - loss: 8.3176 - acc: 0.483 - ETA: 5s - loss: 8.3863 - acc: 0.479 - ETA: 5s - loss: 8.3919 - acc: 0.478 - ETA: 5s - loss: 
8.3727 - acc: 0.480 - ETA: 5s - loss: 8.3642 - acc: 0.480 - ETA: 5s - loss: 8.3736 - acc: 0.480 - ETA: 5s - loss: 8.3988 - acc: 0.478 - ETA: 5s - loss: 8.5438 - acc: 0.469 - ETA: 4s - loss: 8.5364 - acc: 0.469 - ETA: 4s - loss: 8.5017 - acc: 0.471 - ETA: 4s - loss: 8.5101 - acc: 0.471 - ETA: 4s - loss: 8.5243 - acc: 0.470 - ETA: 4s - loss: 8.5188 - acc: 0.470 - ETA: 4s - loss: 8.5555 - acc: 0.468 - ETA: 4s - loss: 8.5781 - acc: 0.467 - ETA: 4s - loss: 8.6102 - acc: 0.465 - ETA: 4s - loss: 8.5918 - acc: 0.466 - ETA: 3s - loss: 8.6110 - acc: 0.465 - ETA: 3s - loss: 8.6139 - acc: 0.465 - ETA: 3s - loss: 8.6121 - acc: 0.464 - ETA: 3s - loss: 8.6005 - acc: 0.465 - ETA: 3s - loss: 8.6442 - acc: 0.462 - ETA: 3s - loss: 8.6683 - acc: 0.461 - ETA: 3s - loss: 8.6474 - acc: 0.462 - ETA: 3s - loss: 8.6363 - acc: 0.463 - ETA: 3s - loss: 8.6504 - acc: 0.462 - ETA: 2s - loss: 8.6437 - acc: 0.463 - ETA: 2s - loss: 8.6412 - acc: 0.463 - ETA: 2s - loss: 8.6274 - acc: 0.464 - ETA: 2s - loss: 8.6366 - acc: 0.463 - ETA: 2s - loss: 8.6308 - acc: 0.463 - ETA: 2s - loss: 8.6360 - acc: 0.463 - ETA: 2s - loss: 8.6305 - acc: 0.463 - ETA: 2s - loss: 8.6492 - acc: 0.462 - ETA: 2s - loss: 8.6369 - acc: 0.463 - ETA: 1s - loss: 8.6317 - acc: 0.463 - ETA: 1s - loss: 8.6429 - acc: 0.463 - ETA: 1s - loss: 8.6291 - acc: 0.463 - ETA: 1s - loss: 8.6059 - acc: 0.465 - ETA: 1s - loss: 8.6260 - acc: 0.464 - ETA: 1s - loss: 8.6334 - acc: 0.463 - ETA: 1s - loss: 8.6347 - acc: 0.463 - ETA: 1s - loss: 8.6359 - acc: 0.463 - ETA: 1s - loss: 8.6512 - acc: 0.462 - ETA: 0s - loss: 8.6472 - acc: 0.462 - ETA: 0s - loss: 8.6509 - acc: 0.462 - ETA: 0s - loss: 8.6437 - acc: 0.462 - ETA: 0s - loss: 8.6236 - acc: 0.464 - ETA: 0s - loss: 8.6186 - acc: 0.464 - ETA: 0s - loss: 8.6098 - acc: 0.464 - ETA: 0s - loss: 8.6333 - acc: 0.463 - ETA: 0s - loss: 8.6269 - acc: 0.463 - ETA: 0s - loss: 8.6232 - acc: 0.463 - 7s 1ms/step - loss: 8.6237 - acc: 0.4639 - val_loss: 9.0811 - val_acc: 0.4287 Epoch 00017: val_loss did not improve Epoch 18/20 6680/6680 [==============================] - ETA: 7s - loss: 7.7367 - acc: 0.520 - ETA: 6s - loss: 8.5426 - acc: 0.470 - ETA: 6s - loss: 8.7046 - acc: 0.460 - ETA: 6s - loss: 8.7447 - acc: 0.457 - ETA: 6s - loss: 8.9345 - acc: 0.444 - ETA: 6s - loss: 8.7081 - acc: 0.458 - ETA: 6s - loss: 8.6845 - acc: 0.460 - ETA: 6s - loss: 8.5863 - acc: 0.466 - ETA: 6s - loss: 8.6710 - acc: 0.461 - ETA: 6s - loss: 8.5776 - acc: 0.467 - ETA: 5s - loss: 8.5890 - acc: 0.466 - ETA: 5s - loss: 8.5852 - acc: 0.466 - ETA: 5s - loss: 8.6191 - acc: 0.464 - ETA: 5s - loss: 8.6102 - acc: 0.465 - ETA: 5s - loss: 8.5843 - acc: 0.466 - ETA: 5s - loss: 8.5011 - acc: 0.471 - ETA: 5s - loss: 8.5510 - acc: 0.468 - ETA: 5s - loss: 8.5595 - acc: 0.468 - ETA: 5s - loss: 8.5756 - acc: 0.467 - ETA: 4s - loss: 8.5256 - acc: 0.470 - ETA: 4s - loss: 8.5340 - acc: 0.470 - ETA: 4s - loss: 8.4905 - acc: 0.472 - ETA: 4s - loss: 8.4762 - acc: 0.473 - ETA: 4s - loss: 8.4734 - acc: 0.473 - ETA: 4s - loss: 8.5090 - acc: 0.470 - ETA: 4s - loss: 8.5103 - acc: 0.470 - ETA: 4s - loss: 8.4995 - acc: 0.471 - ETA: 4s - loss: 8.5129 - acc: 0.470 - ETA: 4s - loss: 8.4861 - acc: 0.471 - ETA: 3s - loss: 8.4826 - acc: 0.472 - ETA: 3s - loss: 8.5002 - acc: 0.471 - ETA: 3s - loss: 8.5065 - acc: 0.470 - ETA: 3s - loss: 8.5290 - acc: 0.469 - ETA: 3s - loss: 8.5531 - acc: 0.467 - ETA: 3s - loss: 8.5391 - acc: 0.468 - ETA: 3s - loss: 8.5212 - acc: 0.469 - ETA: 3s - loss: 8.5392 - acc: 0.468 - ETA: 3s - loss: 8.5521 - acc: 0.467 - ETA: 2s - loss: 8.5683 - acc: 0.466 - ETA: 2s - loss: 
8.5596 - acc: 0.467 - ETA: 2s - loss: 8.5592 - acc: 0.467 - ETA: 2s - loss: 8.5512 - acc: 0.468 - ETA: 2s - loss: 8.5360 - acc: 0.469 - ETA: 2s - loss: 8.5364 - acc: 0.468 - ETA: 2s - loss: 8.5401 - acc: 0.468 - ETA: 2s - loss: 8.5437 - acc: 0.468 - ETA: 2s - loss: 8.5436 - acc: 0.468 - ETA: 1s - loss: 8.5168 - acc: 0.470 - ETA: 1s - loss: 8.5140 - acc: 0.470 - ETA: 1s - loss: 8.5307 - acc: 0.469 - ETA: 1s - loss: 8.5436 - acc: 0.468 - ETA: 1s - loss: 8.5354 - acc: 0.469 - ETA: 1s - loss: 8.5233 - acc: 0.469 - ETA: 1s - loss: 8.5289 - acc: 0.469 - ETA: 1s - loss: 8.5390 - acc: 0.468 - ETA: 1s - loss: 8.5565 - acc: 0.467 - ETA: 1s - loss: 8.5586 - acc: 0.467 - ETA: 0s - loss: 8.5557 - acc: 0.467 - ETA: 0s - loss: 8.5841 - acc: 0.465 - ETA: 0s - loss: 8.5834 - acc: 0.465 - ETA: 0s - loss: 8.6065 - acc: 0.464 - ETA: 0s - loss: 8.5976 - acc: 0.464 - ETA: 0s - loss: 8.5968 - acc: 0.464 - ETA: 0s - loss: 8.6085 - acc: 0.464 - ETA: 0s - loss: 8.6199 - acc: 0.463 - ETA: 0s - loss: 8.6153 - acc: 0.463 - 7s 1ms/step - loss: 8.6134 - acc: 0.4638 - val_loss: 9.0189 - val_acc: 0.4335 Epoch 00018: val_loss did not improve Epoch 19/20 6680/6680 [==============================] - ETA: 6s - loss: 9.6565 - acc: 0.400 - ETA: 6s - loss: 8.6160 - acc: 0.465 - ETA: 6s - loss: 8.3229 - acc: 0.483 - ETA: 6s - loss: 8.4584 - acc: 0.475 - ETA: 6s - loss: 8.3463 - acc: 0.482 - ETA: 6s - loss: 8.2716 - acc: 0.486 - ETA: 6s - loss: 8.1721 - acc: 0.492 - ETA: 6s - loss: 8.2990 - acc: 0.485 - ETA: 6s - loss: 8.2724 - acc: 0.486 - ETA: 6s - loss: 8.3316 - acc: 0.483 - ETA: 6s - loss: 8.3948 - acc: 0.479 - ETA: 6s - loss: 8.3265 - acc: 0.483 - ETA: 6s - loss: 8.4299 - acc: 0.476 - ETA: 6s - loss: 8.4840 - acc: 0.473 - ETA: 6s - loss: 8.4772 - acc: 0.474 - ETA: 6s - loss: 8.4309 - acc: 0.476 - ETA: 6s - loss: 8.3711 - acc: 0.480 - ETA: 6s - loss: 8.3985 - acc: 0.478 - ETA: 6s - loss: 8.3976 - acc: 0.478 - ETA: 6s - loss: 8.4129 - acc: 0.478 - ETA: 6s - loss: 8.4114 - acc: 0.478 - ETA: 6s - loss: 8.4106 - acc: 0.477 - ETA: 6s - loss: 8.4231 - acc: 0.476 - ETA: 6s - loss: 8.3945 - acc: 0.478 - ETA: 5s - loss: 8.4083 - acc: 0.477 - ETA: 5s - loss: 8.4506 - acc: 0.474 - ETA: 5s - loss: 8.4301 - acc: 0.475 - ETA: 5s - loss: 8.4284 - acc: 0.476 - ETA: 5s - loss: 8.4713 - acc: 0.473 - ETA: 5s - loss: 8.4521 - acc: 0.474 - ETA: 4s - loss: 8.4135 - acc: 0.477 - ETA: 4s - loss: 8.3923 - acc: 0.478 - ETA: 4s - loss: 8.3578 - acc: 0.480 - ETA: 4s - loss: 8.3775 - acc: 0.479 - ETA: 4s - loss: 8.3960 - acc: 0.478 - ETA: 4s - loss: 8.4045 - acc: 0.477 - ETA: 4s - loss: 8.4301 - acc: 0.476 - ETA: 3s - loss: 8.4500 - acc: 0.475 - ETA: 3s - loss: 8.3844 - acc: 0.479 - ETA: 3s - loss: 8.4045 - acc: 0.477 - ETA: 3s - loss: 8.4354 - acc: 0.475 - ETA: 3s - loss: 8.4149 - acc: 0.477 - ETA: 3s - loss: 8.4516 - acc: 0.474 - ETA: 3s - loss: 8.4647 - acc: 0.474 - ETA: 2s - loss: 8.4592 - acc: 0.474 - ETA: 2s - loss: 8.4761 - acc: 0.473 - ETA: 2s - loss: 8.4947 - acc: 0.472 - ETA: 2s - loss: 8.4890 - acc: 0.472 - ETA: 2s - loss: 8.4769 - acc: 0.473 - ETA: 2s - loss: 8.4814 - acc: 0.473 - ETA: 2s - loss: 8.4826 - acc: 0.472 - ETA: 1s - loss: 8.4801 - acc: 0.473 - ETA: 1s - loss: 8.4904 - acc: 0.472 - ETA: 1s - loss: 8.4675 - acc: 0.473 - ETA: 1s - loss: 8.4566 - acc: 0.474 - ETA: 1s - loss: 8.4553 - acc: 0.474 - ETA: 1s - loss: 8.4766 - acc: 0.473 - ETA: 1s - loss: 8.4833 - acc: 0.472 - ETA: 0s - loss: 8.4898 - acc: 0.472 - ETA: 0s - loss: 8.5041 - acc: 0.471 - ETA: 0s - loss: 8.5206 - acc: 0.470 - ETA: 0s - loss: 8.5312 - acc: 0.470 - ETA: 0s - loss: 
8.5340 - acc: 0.469 - ETA: 0s - loss: 8.5417 - acc: 0.469 - ETA: 0s - loss: 8.5268 - acc: 0.470 - ETA: 0s - loss: 8.5298 - acc: 0.470 - 9s 1ms/step - loss: 8.5338 - acc: 0.4698 - val_loss: 9.0307 - val_acc: 0.4287 Epoch 00019: val_loss did not improve Epoch 20/20 6680/6680 [==============================] - ETA: 7s - loss: 9.1873 - acc: 0.430 - ETA: 7s - loss: 8.7038 - acc: 0.460 - ETA: 6s - loss: 8.2748 - acc: 0.486 - ETA: 6s - loss: 7.8985 - acc: 0.510 - ETA: 6s - loss: 8.0919 - acc: 0.498 - ETA: 6s - loss: 8.1670 - acc: 0.493 - ETA: 6s - loss: 8.2897 - acc: 0.485 - ETA: 6s - loss: 8.3213 - acc: 0.483 - ETA: 6s - loss: 8.5071 - acc: 0.472 - ETA: 6s - loss: 8.5590 - acc: 0.469 - ETA: 6s - loss: 8.5282 - acc: 0.470 - ETA: 6s - loss: 8.5697 - acc: 0.468 - ETA: 5s - loss: 8.5676 - acc: 0.468 - ETA: 5s - loss: 8.5483 - acc: 0.468 - ETA: 5s - loss: 8.5587 - acc: 0.468 - ETA: 5s - loss: 8.5980 - acc: 0.465 - ETA: 5s - loss: 8.5301 - acc: 0.469 - ETA: 5s - loss: 8.5301 - acc: 0.469 - ETA: 5s - loss: 8.5307 - acc: 0.469 - ETA: 5s - loss: 8.5313 - acc: 0.469 - ETA: 5s - loss: 8.5395 - acc: 0.469 - ETA: 4s - loss: 8.5690 - acc: 0.467 - ETA: 4s - loss: 8.5260 - acc: 0.470 - ETA: 4s - loss: 8.5066 - acc: 0.471 - ETA: 4s - loss: 8.4822 - acc: 0.472 - ETA: 4s - loss: 8.4784 - acc: 0.473 - ETA: 4s - loss: 8.4569 - acc: 0.474 - ETA: 4s - loss: 8.4657 - acc: 0.473 - ETA: 4s - loss: 8.4461 - acc: 0.475 - ETA: 4s - loss: 8.4762 - acc: 0.473 - ETA: 3s - loss: 8.4991 - acc: 0.471 - ETA: 3s - loss: 8.5307 - acc: 0.470 - ETA: 3s - loss: 8.5598 - acc: 0.468 - ETA: 3s - loss: 8.5640 - acc: 0.467 - ETA: 3s - loss: 8.5570 - acc: 0.468 - ETA: 3s - loss: 8.5789 - acc: 0.466 - ETA: 3s - loss: 8.5780 - acc: 0.467 - ETA: 3s - loss: 8.5558 - acc: 0.468 - ETA: 3s - loss: 8.5720 - acc: 0.467 - ETA: 2s - loss: 8.5834 - acc: 0.466 - ETA: 2s - loss: 8.6138 - acc: 0.464 - ETA: 2s - loss: 8.6198 - acc: 0.464 - ETA: 2s - loss: 8.6183 - acc: 0.464 - ETA: 2s - loss: 8.6203 - acc: 0.464 - ETA: 2s - loss: 8.6150 - acc: 0.464 - ETA: 2s - loss: 8.5895 - acc: 0.466 - ETA: 2s - loss: 8.5748 - acc: 0.467 - ETA: 2s - loss: 8.5472 - acc: 0.468 - ETA: 1s - loss: 8.5307 - acc: 0.469 - ETA: 1s - loss: 8.5245 - acc: 0.470 - ETA: 1s - loss: 8.5438 - acc: 0.469 - ETA: 1s - loss: 8.5717 - acc: 0.467 - ETA: 1s - loss: 8.5620 - acc: 0.467 - ETA: 1s - loss: 8.5736 - acc: 0.467 - ETA: 1s - loss: 8.5733 - acc: 0.467 - ETA: 1s - loss: 8.5354 - acc: 0.469 - ETA: 1s - loss: 8.5213 - acc: 0.470 - ETA: 0s - loss: 8.5273 - acc: 0.470 - ETA: 0s - loss: 8.5166 - acc: 0.470 - ETA: 0s - loss: 8.5332 - acc: 0.469 - ETA: 0s - loss: 8.5572 - acc: 0.468 - ETA: 0s - loss: 8.5477 - acc: 0.468 - ETA: 0s - loss: 8.5604 - acc: 0.467 - ETA: 0s - loss: 8.5626 - acc: 0.467 - ETA: 0s - loss: 8.5747 - acc: 0.467 - ETA: 0s - loss: 8.5522 - acc: 0.468 - 8s 1ms/step - loss: 8.5511 - acc: 0.4686 - val_loss: 9.0860 - val_acc: 0.4263 Epoch 00020: val_loss did not improve we are at InceptionV3_model1 Train on 6680 samples, validate on 835 samples Epoch 1/20 6680/6680 [==============================] - ETA: 41s - loss: 6.0675 - acc: 0.0000e+ - ETA: 14s - loss: 8.7578 - acc: 0.0767 - ETA: 9s - loss: 9.3270 - acc: 0.1180 - ETA: 7s - loss: 9.0589 - acc: 0.162 - ETA: 5s - loss: 9.1189 - acc: 0.178 - ETA: 4s - loss: 8.7934 - acc: 0.217 - ETA: 4s - loss: 8.6764 - acc: 0.228 - ETA: 3s - loss: 8.3221 - acc: 0.262 - ETA: 3s - loss: 8.1088 - acc: 0.283 - ETA: 3s - loss: 7.9132 - acc: 0.301 - ETA: 2s - loss: 7.8033 - acc: 0.315 - ETA: 2s - loss: 7.6385 - acc: 0.329 - ETA: 2s - loss: 7.5278 - 
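The "val_loss improved / did not improve" messages in these logs come from Keras's ModelCheckpoint callback with verbose output. Below is a minimal sketch of the kind of training cell that produces logs of this shape; the variable names for the bottleneck features (train_InceptionV3, valid_InceptionV3) are hypothetical stand-ins for arrays prepared in earlier cells, the batch size is assumed, and the two-layer head shown is only one common choice, not necessarily the architecture that produced the run above.

```python
from keras.callbacks import ModelCheckpoint
from keras.layers import Dense, GlobalAveragePooling2D
from keras.models import Sequential

# Illustrative transfer-learning head: pool the InceptionV3 bottleneck
# features, then classify over the breeds in dog_names.
# train_InceptionV3 / valid_InceptionV3 are assumed names for the
# bottleneck feature arrays computed in earlier cells.
InceptionV3_model = Sequential()
InceptionV3_model.add(GlobalAveragePooling2D(input_shape=train_InceptionV3.shape[1:]))
InceptionV3_model.add(Dense(len(dog_names), activation='softmax'))
InceptionV3_model.compile(loss='categorical_crossentropy', optimizer='rmsprop',
                          metrics=['accuracy'])

# save_best_only keeps only the weights with the lowest val_loss so far;
# verbose=1 prints the "Epoch 000NN: val_loss improved / did not improve"
# messages seen in the log above.
checkpointer = ModelCheckpoint(filepath='saved_models/weights.best.InceptionV3.hdf5',
                               monitor='val_loss', verbose=1, save_best_only=True)

InceptionV3_model.fit(train_InceptionV3, train_targets,
                      validation_data=(valid_InceptionV3, valid_targets),
                      epochs=20, batch_size=100,  # batch size assumed
                      callbacks=[checkpointer], verbose=1)
```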
we are at InceptionV3_model1
Train on 6680 samples, validate on 835 samples
Epoch 1/20 - 3s 443us/step - loss: 6.1830 - acc: 0.4702 - val_loss: 4.6974 - val_acc: 0.5725 (Epoch 00001: val_loss improved from inf to 4.69740, saving model to saved_models/weights.best.InceptionV31.hdf5)
Epoch 2/20 - 2s 341us/step - loss: 3.5798 - acc: 0.6924 - val_loss: 3.1609 - val_acc: 0.7126 (Epoch 00002: val_loss improved from 4.69740 to 3.16093, model saved)
Epoch 3/20 - 2s 340us/step - loss: 2.4736 - acc: 0.7876 - val_loss: 2.9334 - val_acc: 0.7126 (Epoch 00003: val_loss improved from 3.16093 to 2.93339, model saved)
Epoch 4/20 - 2s 344us/step - loss: 2.0882 - acc: 0.8240 - val_loss: 2.6019 - val_acc: 0.7234 (Epoch 00004: val_loss improved from 2.93339 to 2.60188, model saved)
Epoch 5/20 - 2s 367us/step - loss: 1.7496 - acc: 0.8549 - val_loss: 2.5484 - val_acc: 0.7425 (Epoch 00005: val_loss improved from 2.60188 to 2.54838, model saved)
Epoch 6/20 - 2s 349us/step - loss: 1.5672 - acc: 0.8717 - val_loss: 2.4832 - val_acc: 0.7413 (Epoch 00006: val_loss improved from 2.54838 to 2.48324, model saved)
Epoch 7/20 - 2s 359us/step - loss: 1.4222 - acc: 0.8874 - val_loss: 2.2118 - val_acc: 0.7677 (Epoch 00007: val_loss improved from 2.48324 to 2.21178, model saved)
Epoch 8/20 - 2s 344us/step - loss: 1.0378 - acc: 0.9046 - val_loss: 1.9014 - val_acc: 0.7832 (Epoch 00008: val_loss improved from 2.21178 to 1.90138, model saved)
Epoch 9/20 - 2s 349us/step - loss: 0.6806 - acc: 0.9359 - val_loss: 1.5264 - val_acc: 0.8192 (Epoch 00009: val_loss improved from 1.90138 to 1.52644, model saved)
Epoch 10/20 - 2s 329us/step - loss: 0.5560 - acc: 0.9501 - val_loss: 1.4713 - val_acc: 0.8120 (Epoch 00010: val_loss improved from 1.52644 to 1.47134, model saved)
Epoch 11/20 - 2s 317us/step - loss: 0.4360 - acc: 0.9558 - val_loss: 1.5250 - val_acc: 0.7904 (Epoch 00011: val_loss did not improve)
Epoch 12/20 - 2s 321us/step - loss: 0.3856 - acc: 0.9653 - val_loss: 1.5158 - val_acc: 0.8216 (Epoch 00012: val_loss did not improve)
Epoch 13/20 - 2s 321us/step - loss: 0.3124 - acc: 0.9699 - val_loss: 1.3787 - val_acc: 0.8240 (Epoch 00013: val_loss improved from 1.47134 to 1.37872, model saved)
Epoch 14/20 - 2s 322us/step - loss: 0.2914 - acc: 0.9749 - val_loss: 1.3632 - val_acc: 0.8407 (Epoch 00014: val_loss improved from 1.37872 to 1.36321, model saved)
Epoch 15/20 - 2s 322us/step - loss: 0.2424 - acc: 0.9760 - val_loss: 1.4850 - val_acc: 0.8287 (Epoch 00015: val_loss did not improve)
Epoch 16/20 - 2s 319us/step - loss: 0.1595 - acc: 0.9801 - val_loss: 1.3972 - val_acc: 0.8371 (Epoch 00016: val_loss did not improve)
Epoch 17/20 - 2s 318us/step - loss: 0.1311 - acc: 0.9858 - val_loss: 1.3829 - val_acc: 0.8359 (Epoch 00017: val_loss did not improve)
Epoch 18/20 - 2s 320us/step - loss: 0.0785 - acc: 0.9897 - val_loss: 1.2990 - val_acc: 0.8335 (Epoch 00018: val_loss improved from 1.36321 to 1.29899, model saved)
Epoch 19/20 - 2s 319us/step - loss: 0.0371 - acc: 0.9928 - val_loss: 1.4313 - val_acc: 0.8228 (Epoch 00019: val_loss did not improve)
Epoch 20/20 - 2s 321us/step - loss: 0.0258 - acc: 0.9943 - val_loss: 1.2344 - val_acc: 0.8383 (Epoch 00020: val_loss improved from 1.29899 to 1.23437, model saved)
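Because training accuracy climbs to 0.99 while val_loss bottoms out mid-run, the checkpoint file, not the final-epoch weights, holds the best model. A sketch of restoring those weights and scoring the test set for the model the log calls InceptionV3_model1; the test-feature array name (test_InceptionV3) is a hypothetical stand-in for an array prepared in earlier cells:

```python
import numpy as np

# Reload the weights with the lowest val_loss, saved by the checkpointer.
InceptionV3_model1.load_weights('saved_models/weights.best.InceptionV31.hdf5')

# Predict a breed index for each test image's bottleneck features and
# compare against the one-hot test targets.
predictions = [np.argmax(InceptionV3_model1.predict(np.expand_dims(f, axis=0)))
               for f in test_InceptionV3]
test_accuracy = 100 * np.mean(np.array(predictions) == np.argmax(test_targets, axis=1))
print('Test accuracy: %.4f%%' % test_accuracy)
```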
we are at InceptionV3_model2
Train on 6680 samples, validate on 835 samples
Epoch 1/20 - 2s 303us/step - loss: 1.5158 - acc: 0.6512 - val_loss: 0.6843 - val_acc: 0.8024 (Epoch 00001: val_loss improved from inf to 0.68434, saving model to saved_models/weights.best.InceptionV32.hdf5)
Epoch 2/20 - 1s 209us/step - loss: 0.4894 - acc: 0.8457 - val_loss: 0.5720 - val_acc: 0.8240 (Epoch 00002: val_loss improved from 0.68434 to 0.57196, model saved)
Epoch 3/20 - 1s 211us/step - loss: 0.3357 - acc: 0.8963 - val_loss: 0.5343 - val_acc: 0.8359 (Epoch 00003: val_loss improved from 0.57196 to 0.53429, model saved)
Epoch 4/20 - 1s 217us/step - loss: 0.2497 - acc: 0.9205 - val_loss: 0.5366 - val_acc: 0.8371 (Epoch 00004: val_loss did not improve)
Epoch 5/20 - 1s 211us/step - loss: 0.1933 - acc: 0.9389 - val_loss: 0.5800 - val_acc: 0.8371 (Epoch 00005: val_loss did not improve)
Epoch 6/20 - 1s 208us/step - loss: 0.1531 - acc: 0.9524 - val_loss: 0.5524 - val_acc: 0.8419 (Epoch 00006: val_loss did not improve)
Epoch 7/20 - 1s 212us/step - loss: 0.1222 - acc: 0.9648 - val_loss: 0.5829 - val_acc: 0.8419 (Epoch 00007: val_loss did not improve)
Epoch 8/20 - 1s 207us/step - loss: 0.0946 - acc: 0.9751 - val_loss: 0.6017 - val_acc: 0.8419 (Epoch 00008: val_loss did not improve)
Epoch 9/20 - 1s 202us/step - loss: 0.0767 - acc: 0.9789 - val_loss: 0.5923 - val_acc: 0.8467 (Epoch 00009: val_loss did not improve)
Epoch 10/20 - 1s 214us/step - loss: 0.0610 - acc: 0.9838 - val_loss: 0.6037 - val_acc: 0.8503 (Epoch 00010: val_loss did not improve)
Epoch 11/20 - 1s 205us/step - loss: 0.0524 - acc: 0.9864 - val_loss: 0.6477 - val_acc: 0.8431 (Epoch 00011: val_loss did not improve)
Epoch 12/20 - 1s 205us/step - loss: 0.0439 - acc: 0.9886 - val_loss: 0.6362 - val_acc: 0.8491 (Epoch 00012: val_loss did not improve)
Epoch 13/20 - 1s 203us/step - loss: 0.0368 - acc: 0.9921 - val_loss: 0.6311 - val_acc: 0.8539
Epoch 00013: val_loss
did not improve Epoch 14/20 6680/6680 [==============================] - ETA: 1s - loss: 0.0250 - acc: 0.990 - ETA: 1s - loss: 0.0377 - acc: 0.990 - ETA: 1s - loss: 0.0290 - acc: 0.992 - ETA: 1s - loss: 0.0250 - acc: 0.995 - ETA: 1s - loss: 0.0230 - acc: 0.995 - ETA: 0s - loss: 0.0226 - acc: 0.995 - ETA: 0s - loss: 0.0233 - acc: 0.994 - ETA: 0s - loss: 0.0272 - acc: 0.994 - ETA: 0s - loss: 0.0288 - acc: 0.994 - ETA: 0s - loss: 0.0297 - acc: 0.994 - ETA: 0s - loss: 0.0286 - acc: 0.994 - ETA: 0s - loss: 0.0285 - acc: 0.994 - ETA: 0s - loss: 0.0288 - acc: 0.994 - ETA: 0s - loss: 0.0288 - acc: 0.994 - ETA: 0s - loss: 0.0284 - acc: 0.994 - ETA: 0s - loss: 0.0281 - acc: 0.995 - ETA: 0s - loss: 0.0284 - acc: 0.995 - ETA: 0s - loss: 0.0284 - acc: 0.995 - ETA: 0s - loss: 0.0288 - acc: 0.994 - ETA: 0s - loss: 0.0287 - acc: 0.994 - ETA: 0s - loss: 0.0283 - acc: 0.994 - ETA: 0s - loss: 0.0286 - acc: 0.994 - 1s 204us/step - loss: 0.0303 - acc: 0.9943 - val_loss: 0.7064 - val_acc: 0.8455 Epoch 00014: val_loss did not improve Epoch 15/20 6680/6680 [==============================] - ETA: 1s - loss: 0.0258 - acc: 1.000 - ETA: 1s - loss: 0.0289 - acc: 0.997 - ETA: 1s - loss: 0.0335 - acc: 0.994 - ETA: 1s - loss: 0.0292 - acc: 0.995 - ETA: 1s - loss: 0.0259 - acc: 0.995 - ETA: 0s - loss: 0.0253 - acc: 0.995 - ETA: 0s - loss: 0.0260 - acc: 0.994 - ETA: 0s - loss: 0.0241 - acc: 0.995 - ETA: 0s - loss: 0.0246 - acc: 0.995 - ETA: 0s - loss: 0.0251 - acc: 0.994 - ETA: 0s - loss: 0.0242 - acc: 0.994 - ETA: 0s - loss: 0.0250 - acc: 0.994 - ETA: 0s - loss: 0.0254 - acc: 0.994 - ETA: 0s - loss: 0.0250 - acc: 0.994 - ETA: 0s - loss: 0.0251 - acc: 0.994 - ETA: 0s - loss: 0.0249 - acc: 0.994 - ETA: 0s - loss: 0.0261 - acc: 0.993 - ETA: 0s - loss: 0.0266 - acc: 0.993 - ETA: 0s - loss: 0.0267 - acc: 0.993 - ETA: 0s - loss: 0.0265 - acc: 0.992 - ETA: 0s - loss: 0.0262 - acc: 0.993 - ETA: 0s - loss: 0.0264 - acc: 0.993 - 1s 205us/step - loss: 0.0265 - acc: 0.9933 - val_loss: 0.6640 - val_acc: 0.8515 Epoch 00015: val_loss did not improve Epoch 16/20 6680/6680 [==============================] - ETA: 1s - loss: 0.0119 - acc: 1.000 - ETA: 1s - loss: 0.0196 - acc: 0.997 - ETA: 1s - loss: 0.0165 - acc: 0.998 - ETA: 1s - loss: 0.0197 - acc: 0.997 - ETA: 0s - loss: 0.0180 - acc: 0.997 - ETA: 0s - loss: 0.0161 - acc: 0.998 - ETA: 0s - loss: 0.0156 - acc: 0.998 - ETA: 0s - loss: 0.0160 - acc: 0.997 - ETA: 0s - loss: 0.0167 - acc: 0.997 - ETA: 0s - loss: 0.0170 - acc: 0.997 - ETA: 0s - loss: 0.0165 - acc: 0.997 - ETA: 0s - loss: 0.0164 - acc: 0.997 - ETA: 0s - loss: 0.0176 - acc: 0.997 - ETA: 0s - loss: 0.0179 - acc: 0.997 - ETA: 0s - loss: 0.0193 - acc: 0.996 - ETA: 0s - loss: 0.0196 - acc: 0.996 - ETA: 0s - loss: 0.0210 - acc: 0.995 - ETA: 0s - loss: 0.0215 - acc: 0.994 - ETA: 0s - loss: 0.0212 - acc: 0.994 - ETA: 0s - loss: 0.0215 - acc: 0.994 - ETA: 0s - loss: 0.0211 - acc: 0.995 - ETA: 0s - loss: 0.0222 - acc: 0.994 - 1s 203us/step - loss: 0.0227 - acc: 0.9943 - val_loss: 0.6789 - val_acc: 0.8575 Epoch 00016: val_loss did not improve Epoch 17/20 6680/6680 [==============================] - ETA: 1s - loss: 0.0140 - acc: 1.000 - ETA: 1s - loss: 0.0101 - acc: 1.000 - ETA: 1s - loss: 0.0140 - acc: 0.998 - ETA: 1s - loss: 0.0157 - acc: 0.996 - ETA: 0s - loss: 0.0170 - acc: 0.995 - ETA: 0s - loss: 0.0156 - acc: 0.996 - ETA: 0s - loss: 0.0180 - acc: 0.994 - ETA: 0s - loss: 0.0177 - acc: 0.995 - ETA: 0s - loss: 0.0168 - acc: 0.995 - ETA: 0s - loss: 0.0189 - acc: 0.994 - ETA: 0s - loss: 0.0191 - acc: 0.994 - ETA: 0s - loss: 0.0191 - acc: 
0.994 - ETA: 0s - loss: 0.0188 - acc: 0.994 - ETA: 0s - loss: 0.0191 - acc: 0.994 - ETA: 0s - loss: 0.0190 - acc: 0.994 - ETA: 0s - loss: 0.0194 - acc: 0.994 - ETA: 0s - loss: 0.0210 - acc: 0.994 - ETA: 0s - loss: 0.0205 - acc: 0.994 - ETA: 0s - loss: 0.0205 - acc: 0.994 - ETA: 0s - loss: 0.0200 - acc: 0.994 - ETA: 0s - loss: 0.0201 - acc: 0.994 - ETA: 0s - loss: 0.0203 - acc: 0.994 - 1s 204us/step - loss: 0.0202 - acc: 0.9946 - val_loss: 0.6767 - val_acc: 0.8575 Epoch 00017: val_loss did not improve Epoch 18/20 6680/6680 [==============================] - ETA: 1s - loss: 0.0138 - acc: 1.000 - ETA: 1s - loss: 0.0158 - acc: 0.997 - ETA: 1s - loss: 0.0172 - acc: 0.997 - ETA: 1s - loss: 0.0154 - acc: 0.998 - ETA: 0s - loss: 0.0148 - acc: 0.997 - ETA: 0s - loss: 0.0151 - acc: 0.996 - ETA: 0s - loss: 0.0173 - acc: 0.995 - ETA: 0s - loss: 0.0165 - acc: 0.996 - ETA: 0s - loss: 0.0180 - acc: 0.995 - ETA: 0s - loss: 0.0174 - acc: 0.996 - ETA: 0s - loss: 0.0170 - acc: 0.996 - ETA: 0s - loss: 0.0170 - acc: 0.996 - ETA: 0s - loss: 0.0167 - acc: 0.996 - ETA: 0s - loss: 0.0164 - acc: 0.996 - ETA: 0s - loss: 0.0158 - acc: 0.997 - ETA: 0s - loss: 0.0154 - acc: 0.997 - ETA: 0s - loss: 0.0160 - acc: 0.996 - ETA: 0s - loss: 0.0162 - acc: 0.996 - ETA: 0s - loss: 0.0162 - acc: 0.997 - ETA: 0s - loss: 0.0168 - acc: 0.997 - ETA: 0s - loss: 0.0169 - acc: 0.996 - ETA: 0s - loss: 0.0171 - acc: 0.996 - 1s 205us/step - loss: 0.0169 - acc: 0.9969 - val_loss: 0.6912 - val_acc: 0.8467 Epoch 00018: val_loss did not improve Epoch 19/20 6680/6680 [==============================] - ETA: 1s - loss: 0.0149 - acc: 1.000 - ETA: 1s - loss: 0.0096 - acc: 1.000 - ETA: 1s - loss: 0.0146 - acc: 0.997 - ETA: 1s - loss: 0.0123 - acc: 0.998 - ETA: 0s - loss: 0.0132 - acc: 0.996 - ETA: 0s - loss: 0.0123 - acc: 0.996 - ETA: 0s - loss: 0.0125 - acc: 0.996 - ETA: 0s - loss: 0.0118 - acc: 0.997 - ETA: 0s - loss: 0.0130 - acc: 0.997 - ETA: 0s - loss: 0.0126 - acc: 0.997 - ETA: 0s - loss: 0.0138 - acc: 0.997 - ETA: 0s - loss: 0.0137 - acc: 0.997 - ETA: 0s - loss: 0.0135 - acc: 0.997 - ETA: 0s - loss: 0.0133 - acc: 0.997 - ETA: 0s - loss: 0.0129 - acc: 0.997 - ETA: 0s - loss: 0.0130 - acc: 0.997 - ETA: 0s - loss: 0.0129 - acc: 0.997 - ETA: 0s - loss: 0.0131 - acc: 0.997 - ETA: 0s - loss: 0.0136 - acc: 0.997 - ETA: 0s - loss: 0.0137 - acc: 0.997 - ETA: 0s - loss: 0.0144 - acc: 0.996 - ETA: 0s - loss: 0.0142 - acc: 0.997 - 1s 204us/step - loss: 0.0146 - acc: 0.9967 - val_loss: 0.6927 - val_acc: 0.8563 Epoch 00019: val_loss did not improve Epoch 20/20 6680/6680 [==============================] - ETA: 1s - loss: 0.0105 - acc: 1.000 - ETA: 1s - loss: 0.0088 - acc: 0.997 - ETA: 1s - loss: 0.0198 - acc: 0.994 - ETA: 1s - loss: 0.0156 - acc: 0.996 - ETA: 1s - loss: 0.0142 - acc: 0.996 - ETA: 0s - loss: 0.0148 - acc: 0.995 - ETA: 0s - loss: 0.0145 - acc: 0.995 - ETA: 0s - loss: 0.0139 - acc: 0.995 - ETA: 0s - loss: 0.0150 - acc: 0.996 - ETA: 0s - loss: 0.0141 - acc: 0.996 - ETA: 0s - loss: 0.0143 - acc: 0.996 - ETA: 0s - loss: 0.0147 - acc: 0.996 - ETA: 0s - loss: 0.0143 - acc: 0.996 - ETA: 0s - loss: 0.0145 - acc: 0.996 - ETA: 0s - loss: 0.0141 - acc: 0.996 - ETA: 0s - loss: 0.0140 - acc: 0.996 - ETA: 0s - loss: 0.0138 - acc: 0.996 - ETA: 0s - loss: 0.0140 - acc: 0.996 - ETA: 0s - loss: 0.0138 - acc: 0.996 - ETA: 0s - loss: 0.0136 - acc: 0.997 - ETA: 0s - loss: 0.0135 - acc: 0.997 - ETA: 0s - loss: 0.0133 - acc: 0.997 - 1s 206us/step - loss: 0.0133 - acc: 0.9973 - val_loss: 0.7685 - val_acc: 0.8503 Epoch 00020: val_loss did not improve we are at 
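The shape of these logs (a "we are at <name>" banner, then twenty epochs each followed by a checkpoint message) comes from fitting each candidate head with a ModelCheckpoint callback that monitors val_loss. The following is a minimal sketch of such a loop, not the notebook's own cell: the helper name train_candidate, the compile settings, and the argument names are assumptions; only the banner text, the epoch count, and the saved_models/weights.best.<name>.hdf5 path pattern are taken from the output above.

from keras.callbacks import ModelCheckpoint

def train_candidate(name, model, X_train, y_train, X_valid, y_valid):
    # Print the banner seen before each run in the log above.
    print('we are at ' + name)
    # Keep only the weights from the epoch with the lowest validation
    # loss, using the same file naming as the checkpoint messages.
    checkpointer = ModelCheckpoint(
        filepath='saved_models/weights.best.' + name + '.hdf5',
        monitor='val_loss', save_best_only=True, verbose=1)
    # Assumed compile settings; the notebook's actual optimizer may differ.
    model.compile(loss='categorical_crossentropy', optimizer='rmsprop',
                  metrics=['accuracy'])
    model.fit(X_train, y_train,
              validation_data=(X_valid, y_valid),
              epochs=20, callbacks=[checkpointer], verbose=1)

Because save_best_only=True, the weights left on disk correspond to the epoch with the lowest validation loss (epoch 3 for InceptionV3_model2), not to the final epoch; that matters here because training accuracy keeps climbing toward 0.99 while val_loss drifts upward, a clear overfitting signature.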
we are at InceptionV3_model3
Train on 6680 samples, validate on 835 samples
Epoch 1/20 - loss: 8.7142 - acc: 0.3135 - val_loss: 7.9707 - val_acc: 0.4251 (val_loss improved from inf to 7.97073; saved to saved_models/weights.best.InceptionV33.hdf5)
Epoch 2/20 - loss: 7.6865 - acc: 0.4537 - val_loss: 8.0334 - val_acc: 0.4132 (val_loss did not improve)
Epoch 3/20 - loss: 7.5586 - acc: 0.4799 - val_loss: 7.7813 - val_acc: 0.4647 (val_loss improved to 7.78127; saved)
Epoch 4/20 - loss: 7.5025 - acc: 0.4954 - val_loss: 7.8193 - val_acc: 0.4671 (val_loss did not improve)
Epoch 5/20 - loss: 7.4448 - acc: 0.5090 - val_loss: 7.7409 - val_acc: 0.4814 (val_loss improved to 7.74093; saved)
Epoch 6/20 - loss: 7.4109 - acc: 0.5165 - val_loss: 7.7752 - val_acc: 0.4754 (val_loss did not improve)
Epoch 7/20 - loss: 7.3878 - acc: 0.5238 - val_loss: 7.7636 - val_acc: 0.4862 (val_loss did not improve)
Epoch 8/20 - loss: 7.3937 - acc: 0.5237 - val_loss: 7.8136 - val_acc: 0.4647 (val_loss did not improve)
Epoch 9/20 - loss: 7.3665 - acc: 0.5310 - val_loss: 7.8224 - val_acc: 0.4754 (val_loss did not improve)
Epoch 10/20 - loss: 7.3594 - acc: 0.5335 - val_loss: 7.7420 - val_acc: 0.4946 (val_loss did not improve)
Epoch 11/20 - loss: 7.3504 - acc: 0.5365 - val_loss: 7.7897 - val_acc: 0.4874 (val_loss did not improve)
Epoch 12/20 - loss: 7.3504 - acc: 0.5377 - val_loss: 7.7605 - val_acc: 0.4874 (val_loss did not improve)
Epoch 13/20 - loss: 7.3452 - acc: 0.5379 - val_loss: 7.7869 - val_acc: 0.4838 (val_loss did not improve)
Epoch 14/20 - loss: 7.3468 - acc: 0.5389 - val_loss: 7.7790 - val_acc: 0.4898 (val_loss did not improve)
Epoch 15/20 - loss: 7.3391 - acc: 0.5412 - val_loss: 7.8615 - val_acc: 0.4766 (val_loss did not improve)
Epoch 16/20 - loss: 7.3379 - acc: 0.5409 - val_loss: 7.8315 - val_acc: 0.4802 (val_loss did not improve)
Epoch 17/20 - loss: 7.3360 - acc: 0.5416 - val_loss: 7.7806 - val_acc: 0.4802 (val_loss did not improve)
Epoch 18/20 - loss: 7.3340 - acc: 0.5419 - val_loss: 7.7903 - val_acc: 0.4838 (val_loss did not improve)
Epoch 19/20 - loss: 7.3336 - acc: 0.5419 - val_loss: 7.8359 - val_acc: 0.4838 (val_loss did not improve)
Epoch 20/20 - loss: 7.3333 - acc: 0.5415 - val_loss: 7.7724 - val_acc: 0.4886 (val_loss did not improve)
we are at InceptionV3_model4
Train on 6680 samples, validate on 835 samples
loss: 10.9537 - acc: 0.23 - ETA: 1s - loss: 10.9072 - acc: 0.23 - ETA: 1s - loss: 10.8844 - acc: 0.24 - ETA: 1s - loss: 10.8236 - acc: 0.24 - ETA: 0s - loss: 10.8054 - acc: 0.24 - ETA: 0s - loss: 10.7727 - acc: 0.25 - ETA: 0s - loss: 10.6943 - acc: 0.25 - ETA: 0s - loss: 10.6528 - acc: 0.26 - ETA: 0s - loss: 10.6221 - acc: 0.26 - ETA: 0s - loss: 10.6215 - acc: 0.26 - ETA: 0s - loss: 10.5848 - acc: 0.27 - ETA: 0s - loss: 10.5641 - acc: 0.27 - ETA: 0s - loss: 10.5717 - acc: 0.27 - ETA: 0s - loss: 10.5515 - acc: 0.27 - ETA: 0s - loss: 10.5729 - acc: 0.27 - ETA: 0s - loss: 10.5378 - acc: 0.27 - 3s 395us/step - loss: 10.5314 - acc: 0.2793 - val_loss: 10.0427 - val_acc: 0.3329 Epoch 00001: val_loss improved from inf to 10.04272, saving model to saved_models/weights.best.InceptionV34.hdf5 Epoch 2/20 6680/6680 [==============================] - ETA: 1s - loss: 9.0685 - acc: 0.390 - ETA: 1s - loss: 9.3659 - acc: 0.390 - ETA: 1s - loss: 9.4228 - acc: 0.392 - ETA: 1s - loss: 9.5910 - acc: 0.384 - ETA: 1s - loss: 9.5644 - acc: 0.385 - ETA: 1s - loss: 9.6258 - acc: 0.382 - ETA: 1s - loss: 9.5173 - acc: 0.384 - ETA: 1s - loss: 9.5772 - acc: 0.382 - ETA: 1s - loss: 9.6150 - acc: 0.381 - ETA: 1s - loss: 9.5979 - acc: 0.383 - ETA: 1s - loss: 9.6409 - acc: 0.380 - ETA: 1s - loss: 9.6380 - acc: 0.378 - ETA: 1s - loss: 9.6267 - acc: 0.378 - ETA: 1s - loss: 9.6487 - acc: 0.375 - ETA: 1s - loss: 9.6991 - acc: 0.372 - ETA: 1s - loss: 9.7307 - acc: 0.370 - ETA: 0s - loss: 9.6948 - acc: 0.373 - ETA: 0s - loss: 9.6791 - acc: 0.374 - ETA: 0s - loss: 9.6875 - acc: 0.371 - ETA: 0s - loss: 9.7196 - acc: 0.370 - ETA: 0s - loss: 9.7261 - acc: 0.370 - ETA: 0s - loss: 9.7676 - acc: 0.367 - ETA: 0s - loss: 9.7465 - acc: 0.369 - ETA: 0s - loss: 9.7938 - acc: 0.367 - ETA: 0s - loss: 9.8489 - acc: 0.363 - ETA: 0s - loss: 9.8252 - acc: 0.365 - ETA: 0s - loss: 9.8029 - acc: 0.366 - ETA: 0s - loss: 9.8081 - acc: 0.365 - ETA: 0s - loss: 9.8409 - acc: 0.362 - ETA: 0s - loss: 9.8122 - acc: 0.364 - ETA: 0s - loss: 9.7894 - acc: 0.365 - ETA: 0s - loss: 9.8317 - acc: 0.363 - ETA: 0s - loss: 9.8565 - acc: 0.361 - 2s 300us/step - loss: 9.8588 - acc: 0.3614 - val_loss: 9.8202 - val_acc: 0.3629 Epoch 00002: val_loss improved from 10.04272 to 9.82022, saving model to saved_models/weights.best.InceptionV34.hdf5 Epoch 3/20 6680/6680 [==============================] - ETA: 1s - loss: 10.8413 - acc: 0.31 - ETA: 1s - loss: 10.0708 - acc: 0.36 - ETA: 1s - loss: 9.9489 - acc: 0.3660 - ETA: 1s - loss: 9.8914 - acc: 0.371 - ETA: 1s - loss: 9.8609 - acc: 0.373 - ETA: 1s - loss: 9.7590 - acc: 0.380 - ETA: 1s - loss: 9.6504 - acc: 0.386 - ETA: 1s - loss: 9.6669 - acc: 0.385 - ETA: 1s - loss: 9.6041 - acc: 0.388 - ETA: 1s - loss: 9.6559 - acc: 0.384 - ETA: 1s - loss: 9.6101 - acc: 0.387 - ETA: 1s - loss: 9.6569 - acc: 0.382 - ETA: 1s - loss: 9.6571 - acc: 0.380 - ETA: 1s - loss: 9.6167 - acc: 0.383 - ETA: 1s - loss: 9.6370 - acc: 0.382 - ETA: 1s - loss: 9.6059 - acc: 0.384 - ETA: 0s - loss: 9.6217 - acc: 0.383 - ETA: 0s - loss: 9.5820 - acc: 0.386 - ETA: 0s - loss: 9.6011 - acc: 0.385 - ETA: 0s - loss: 9.5741 - acc: 0.386 - ETA: 0s - loss: 9.5400 - acc: 0.388 - ETA: 0s - loss: 9.5365 - acc: 0.388 - ETA: 0s - loss: 9.5914 - acc: 0.385 - ETA: 0s - loss: 9.6133 - acc: 0.384 - ETA: 0s - loss: 9.6069 - acc: 0.385 - ETA: 0s - loss: 9.6028 - acc: 0.385 - ETA: 0s - loss: 9.6081 - acc: 0.384 - ETA: 0s - loss: 9.6356 - acc: 0.382 - ETA: 0s - loss: 9.6369 - acc: 0.383 - ETA: 0s - loss: 9.6424 - acc: 0.382 - ETA: 0s - loss: 9.6229 - acc: 0.383 - ETA: 0s - loss: 
9.6462 - acc: 0.382 - ETA: 0s - loss: 9.6435 - acc: 0.382 - 2s 298us/step - loss: 9.6460 - acc: 0.3825 - val_loss: 9.8953 - val_acc: 0.3629 Epoch 00003: val_loss did not improve Epoch 4/20 6680/6680 [==============================] - ETA: 1s - loss: 11.3144 - acc: 0.28 - ETA: 1s - loss: 10.2619 - acc: 0.34 - ETA: 1s - loss: 10.2707 - acc: 0.34 - ETA: 1s - loss: 9.9861 - acc: 0.3643 - ETA: 1s - loss: 9.6420 - acc: 0.386 - ETA: 1s - loss: 9.6499 - acc: 0.387 - ETA: 1s - loss: 9.5843 - acc: 0.390 - ETA: 1s - loss: 9.6739 - acc: 0.384 - ETA: 1s - loss: 9.7144 - acc: 0.382 - ETA: 1s - loss: 9.7402 - acc: 0.381 - ETA: 1s - loss: 9.7007 - acc: 0.384 - ETA: 1s - loss: 9.7282 - acc: 0.381 - ETA: 1s - loss: 9.7044 - acc: 0.383 - ETA: 1s - loss: 9.6403 - acc: 0.387 - ETA: 1s - loss: 9.6431 - acc: 0.386 - ETA: 1s - loss: 9.6685 - acc: 0.384 - ETA: 0s - loss: 9.7057 - acc: 0.382 - ETA: 0s - loss: 9.7154 - acc: 0.382 - ETA: 0s - loss: 9.7466 - acc: 0.380 - ETA: 0s - loss: 9.6964 - acc: 0.383 - ETA: 0s - loss: 9.7193 - acc: 0.382 - ETA: 0s - loss: 9.7201 - acc: 0.382 - ETA: 0s - loss: 9.7079 - acc: 0.382 - ETA: 0s - loss: 9.6755 - acc: 0.385 - ETA: 0s - loss: 9.6707 - acc: 0.385 - ETA: 0s - loss: 9.6861 - acc: 0.384 - ETA: 0s - loss: 9.6772 - acc: 0.384 - ETA: 0s - loss: 9.6354 - acc: 0.387 - ETA: 0s - loss: 9.6529 - acc: 0.387 - ETA: 0s - loss: 9.6305 - acc: 0.388 - ETA: 0s - loss: 9.6114 - acc: 0.389 - ETA: 0s - loss: 9.6144 - acc: 0.389 - ETA: 0s - loss: 9.5973 - acc: 0.390 - 2s 302us/step - loss: 9.5777 - acc: 0.3907 - val_loss: 9.7871 - val_acc: 0.3725 Epoch 00004: val_loss improved from 9.82022 to 9.78705, saving model to saved_models/weights.best.InceptionV34.hdf5 Epoch 5/20 6680/6680 [==============================] - ETA: 1s - loss: 8.2651 - acc: 0.480 - ETA: 1s - loss: 9.1250 - acc: 0.423 - ETA: 1s - loss: 9.3222 - acc: 0.408 - ETA: 1s - loss: 9.7899 - acc: 0.381 - ETA: 1s - loss: 9.7495 - acc: 0.385 - ETA: 1s - loss: 9.6335 - acc: 0.393 - ETA: 1s - loss: 9.5826 - acc: 0.397 - ETA: 1s - loss: 9.4986 - acc: 0.403 - ETA: 1s - loss: 9.5419 - acc: 0.400 - ETA: 1s - loss: 9.5519 - acc: 0.399 - ETA: 1s - loss: 9.5392 - acc: 0.399 - ETA: 1s - loss: 9.4935 - acc: 0.402 - ETA: 1s - loss: 9.4847 - acc: 0.402 - ETA: 1s - loss: 9.5293 - acc: 0.398 - ETA: 1s - loss: 9.5289 - acc: 0.398 - ETA: 0s - loss: 9.4898 - acc: 0.400 - ETA: 0s - loss: 9.5001 - acc: 0.399 - ETA: 0s - loss: 9.4637 - acc: 0.401 - ETA: 0s - loss: 9.5055 - acc: 0.399 - ETA: 0s - loss: 9.5265 - acc: 0.398 - ETA: 0s - loss: 9.4948 - acc: 0.401 - ETA: 0s - loss: 9.4928 - acc: 0.401 - ETA: 0s - loss: 9.4931 - acc: 0.401 - ETA: 0s - loss: 9.5078 - acc: 0.400 - ETA: 0s - loss: 9.4956 - acc: 0.401 - ETA: 0s - loss: 9.5066 - acc: 0.401 - ETA: 0s - loss: 9.5215 - acc: 0.400 - ETA: 0s - loss: 9.5416 - acc: 0.399 - ETA: 0s - loss: 9.5300 - acc: 0.400 - ETA: 0s - loss: 9.4841 - acc: 0.402 - ETA: 0s - loss: 9.4925 - acc: 0.402 - ETA: 0s - loss: 9.5135 - acc: 0.401 - ETA: 0s - loss: 9.5063 - acc: 0.401 - 2s 298us/step - loss: 9.5120 - acc: 0.4013 - val_loss: 9.8160 - val_acc: 0.3677 Epoch 00005: val_loss did not improve Epoch 6/20 6680/6680 [==============================] - ETA: 1s - loss: 7.9197 - acc: 0.500 - ETA: 1s - loss: 8.9909 - acc: 0.433 - ETA: 1s - loss: 9.1266 - acc: 0.424 - ETA: 1s - loss: 9.2218 - acc: 0.418 - ETA: 1s - loss: 9.1520 - acc: 0.423 - ETA: 1s - loss: 9.4686 - acc: 0.403 - ETA: 1s - loss: 9.5129 - acc: 0.400 - ETA: 1s - loss: 9.5939 - acc: 0.394 - ETA: 1s - loss: 9.4935 - acc: 0.401 - ETA: 1s - loss: 9.5739 - acc: 0.396 - ETA: 
1s - loss: 9.5729 - acc: 0.395 - ETA: 1s - loss: 9.6317 - acc: 0.392 - ETA: 1s - loss: 9.6292 - acc: 0.393 - ETA: 1s - loss: 9.5786 - acc: 0.397 - ETA: 1s - loss: 9.5905 - acc: 0.396 - ETA: 1s - loss: 9.5751 - acc: 0.398 - ETA: 0s - loss: 9.5633 - acc: 0.399 - ETA: 0s - loss: 9.5823 - acc: 0.398 - ETA: 0s - loss: 9.6053 - acc: 0.396 - ETA: 0s - loss: 9.5749 - acc: 0.398 - ETA: 0s - loss: 9.5374 - acc: 0.400 - ETA: 0s - loss: 9.5210 - acc: 0.400 - ETA: 0s - loss: 9.5166 - acc: 0.400 - ETA: 0s - loss: 9.4925 - acc: 0.402 - ETA: 0s - loss: 9.4998 - acc: 0.402 - ETA: 0s - loss: 9.5107 - acc: 0.401 - ETA: 0s - loss: 9.4674 - acc: 0.404 - ETA: 0s - loss: 9.4515 - acc: 0.405 - ETA: 0s - loss: 9.4847 - acc: 0.403 - ETA: 0s - loss: 9.4855 - acc: 0.404 - ETA: 0s - loss: 9.5108 - acc: 0.402 - ETA: 0s - loss: 9.4834 - acc: 0.404 - ETA: 0s - loss: 9.4845 - acc: 0.404 - 2s 300us/step - loss: 9.4921 - acc: 0.4040 - val_loss: 9.7381 - val_acc: 0.3772 Epoch 00006: val_loss improved from 9.78705 to 9.73815, saving model to saved_models/weights.best.InceptionV34.hdf5 Epoch 7/20 6680/6680 [==============================] - ETA: 1s - loss: 9.3495 - acc: 0.420 - ETA: 1s - loss: 9.5100 - acc: 0.410 - ETA: 1s - loss: 9.4807 - acc: 0.410 - ETA: 1s - loss: 9.3742 - acc: 0.417 - ETA: 1s - loss: 9.3827 - acc: 0.416 - ETA: 1s - loss: 9.5231 - acc: 0.408 - ETA: 1s - loss: 9.6326 - acc: 0.401 - ETA: 1s - loss: 9.5411 - acc: 0.407 - ETA: 1s - loss: 9.5566 - acc: 0.406 - ETA: 1s - loss: 9.5778 - acc: 0.404 - ETA: 1s - loss: 9.6100 - acc: 0.401 - ETA: 1s - loss: 9.5161 - acc: 0.406 - ETA: 1s - loss: 9.4488 - acc: 0.410 - ETA: 1s - loss: 9.4215 - acc: 0.411 - ETA: 1s - loss: 9.4225 - acc: 0.411 - ETA: 1s - loss: 9.5155 - acc: 0.405 - ETA: 0s - loss: 9.4830 - acc: 0.407 - ETA: 0s - loss: 9.4598 - acc: 0.408 - ETA: 0s - loss: 9.5236 - acc: 0.404 - ETA: 0s - loss: 9.5074 - acc: 0.405 - ETA: 0s - loss: 9.4733 - acc: 0.407 - ETA: 0s - loss: 9.5089 - acc: 0.405 - ETA: 0s - loss: 9.5125 - acc: 0.405 - ETA: 0s - loss: 9.4529 - acc: 0.408 - ETA: 0s - loss: 9.4171 - acc: 0.411 - ETA: 0s - loss: 9.4334 - acc: 0.410 - ETA: 0s - loss: 9.4487 - acc: 0.409 - ETA: 0s - loss: 9.4602 - acc: 0.408 - ETA: 0s - loss: 9.4423 - acc: 0.408 - ETA: 0s - loss: 9.4593 - acc: 0.407 - ETA: 0s - loss: 9.4611 - acc: 0.407 - ETA: 0s - loss: 9.4763 - acc: 0.406 - ETA: 0s - loss: 9.4684 - acc: 0.407 - 2s 300us/step - loss: 9.4913 - acc: 0.4058 - val_loss: 9.7557 - val_acc: 0.3760 Epoch 00007: val_loss did not improve Epoch 8/20 6680/6680 [==============================] - ETA: 1s - loss: 9.9964 - acc: 0.380 - ETA: 1s - loss: 9.5110 - acc: 0.410 - ETA: 1s - loss: 9.5755 - acc: 0.406 - ETA: 1s - loss: 9.6499 - acc: 0.400 - ETA: 1s - loss: 9.6505 - acc: 0.398 - ETA: 1s - loss: 9.5873 - acc: 0.401 - ETA: 1s - loss: 9.6160 - acc: 0.399 - ETA: 1s - loss: 9.6226 - acc: 0.398 - ETA: 1s - loss: 9.6195 - acc: 0.398 - ETA: 1s - loss: 9.6419 - acc: 0.397 - ETA: 1s - loss: 9.5934 - acc: 0.401 - ETA: 1s - loss: 9.5517 - acc: 0.403 - ETA: 1s - loss: 9.5680 - acc: 0.402 - ETA: 1s - loss: 9.5234 - acc: 0.405 - ETA: 1s - loss: 9.5226 - acc: 0.405 - ETA: 0s - loss: 9.4698 - acc: 0.409 - ETA: 0s - loss: 9.4233 - acc: 0.412 - ETA: 0s - loss: 9.4340 - acc: 0.411 - ETA: 0s - loss: 9.4119 - acc: 0.413 - ETA: 0s - loss: 9.4087 - acc: 0.413 - ETA: 0s - loss: 9.4415 - acc: 0.411 - ETA: 0s - loss: 9.4997 - acc: 0.407 - ETA: 0s - loss: 9.4787 - acc: 0.409 - ETA: 0s - loss: 9.4670 - acc: 0.409 - ETA: 0s - loss: 9.4722 - acc: 0.409 - ETA: 0s - loss: 9.4642 - acc: 0.410 - ETA: 0s - loss: 
9.4787 - acc: 0.409 - ETA: 0s - loss: 9.4807 - acc: 0.408 - ETA: 0s - loss: 9.4864 - acc: 0.408 - ETA: 0s - loss: 9.5026 - acc: 0.406 - ETA: 0s - loss: 9.5161 - acc: 0.406 - ETA: 0s - loss: 9.5302 - acc: 0.405 - ETA: 0s - loss: 9.4979 - acc: 0.407 - 2s 298us/step - loss: 9.4881 - acc: 0.4081 - val_loss: 9.7231 - val_acc: 0.3796 Epoch 00008: val_loss improved from 9.73815 to 9.72314, saving model to saved_models/weights.best.InceptionV34.hdf5 Epoch 9/20 6680/6680 [==============================] - ETA: 1s - loss: 8.3815 - acc: 0.480 - ETA: 1s - loss: 8.3814 - acc: 0.480 - ETA: 1s - loss: 9.0264 - acc: 0.440 - ETA: 1s - loss: 9.1879 - acc: 0.430 - ETA: 1s - loss: 9.4206 - acc: 0.415 - ETA: 1s - loss: 9.5101 - acc: 0.410 - ETA: 1s - loss: 9.4625 - acc: 0.412 - ETA: 1s - loss: 9.4919 - acc: 0.410 - ETA: 1s - loss: 9.4466 - acc: 0.412 - ETA: 1s - loss: 9.4376 - acc: 0.412 - ETA: 1s - loss: 9.3546 - acc: 0.417 - ETA: 1s - loss: 9.3761 - acc: 0.415 - ETA: 1s - loss: 9.3841 - acc: 0.415 - ETA: 1s - loss: 9.3696 - acc: 0.416 - ETA: 1s - loss: 9.3182 - acc: 0.419 - ETA: 1s - loss: 9.4086 - acc: 0.414 - ETA: 0s - loss: 9.4197 - acc: 0.413 - ETA: 0s - loss: 9.5216 - acc: 0.407 - ETA: 0s - loss: 9.5429 - acc: 0.405 - ETA: 0s - loss: 9.5371 - acc: 0.406 - ETA: 0s - loss: 9.5516 - acc: 0.405 - ETA: 0s - loss: 9.5748 - acc: 0.404 - ETA: 0s - loss: 9.5586 - acc: 0.405 - ETA: 0s - loss: 9.5121 - acc: 0.408 - ETA: 0s - loss: 9.5350 - acc: 0.406 - ETA: 0s - loss: 9.5435 - acc: 0.406 - ETA: 0s - loss: 9.5328 - acc: 0.406 - ETA: 0s - loss: 9.5564 - acc: 0.404 - ETA: 0s - loss: 9.5531 - acc: 0.405 - ETA: 0s - loss: 9.5300 - acc: 0.406 - ETA: 0s - loss: 9.5136 - acc: 0.407 - ETA: 0s - loss: 9.5113 - acc: 0.407 - ETA: 0s - loss: 9.5113 - acc: 0.407 - 2s 300us/step - loss: 9.4746 - acc: 0.4099 - val_loss: 9.7055 - val_acc: 0.3856 Epoch 00009: val_loss improved from 9.72314 to 9.70552, saving model to saved_models/weights.best.InceptionV34.hdf5 Epoch 10/20 6680/6680 [==============================] - ETA: 1s - loss: 9.0263 - acc: 0.440 - ETA: 1s - loss: 9.3505 - acc: 0.420 - ETA: 1s - loss: 9.2530 - acc: 0.426 - ETA: 1s - loss: 9.2342 - acc: 0.427 - ETA: 1s - loss: 9.3315 - acc: 0.421 - ETA: 1s - loss: 9.3053 - acc: 0.422 - ETA: 1s - loss: 9.3867 - acc: 0.417 - ETA: 1s - loss: 9.2419 - acc: 0.426 - ETA: 1s - loss: 9.3587 - acc: 0.419 - ETA: 1s - loss: 9.3576 - acc: 0.419 - ETA: 1s - loss: 9.3261 - acc: 0.421 - ETA: 1s - loss: 9.3070 - acc: 0.422 - ETA: 1s - loss: 9.3426 - acc: 0.420 - ETA: 1s - loss: 9.2773 - acc: 0.424 - ETA: 1s - loss: 9.2773 - acc: 0.424 - ETA: 1s - loss: 9.2140 - acc: 0.427 - ETA: 0s - loss: 9.2320 - acc: 0.426 - ETA: 0s - loss: 9.2679 - acc: 0.424 - ETA: 0s - loss: 9.2620 - acc: 0.424 - ETA: 0s - loss: 9.2792 - acc: 0.423 - ETA: 0s - loss: 9.3536 - acc: 0.418 - ETA: 0s - loss: 9.3493 - acc: 0.418 - ETA: 0s - loss: 9.3780 - acc: 0.416 - ETA: 0s - loss: 9.3939 - acc: 0.416 - ETA: 0s - loss: 9.4118 - acc: 0.414 - ETA: 0s - loss: 9.4381 - acc: 0.413 - ETA: 0s - loss: 9.4682 - acc: 0.411 - ETA: 0s - loss: 9.4665 - acc: 0.411 - ETA: 0s - loss: 9.4426 - acc: 0.413 - ETA: 0s - loss: 9.4349 - acc: 0.413 - ETA: 0s - loss: 9.4479 - acc: 0.412 - ETA: 0s - loss: 9.4789 - acc: 0.410 - ETA: 0s - loss: 9.4881 - acc: 0.409 - 2s 300us/step - loss: 9.4719 - acc: 0.4106 - val_loss: 9.6965 - val_acc: 0.3832 Epoch 00010: val_loss improved from 9.70552 to 9.69648, saving model to saved_models/weights.best.InceptionV34.hdf5 Epoch 11/20 6680/6680 [==============================] - ETA: 1s - loss: 9.1873 - acc: 0.430 - 
ETA: 1s - loss: 9.0261 - acc: 0.440 - ETA: 1s - loss: 9.3163 - acc: 0.422 - ETA: 1s - loss: 9.1648 - acc: 0.431 - ETA: 1s - loss: 9.3668 - acc: 0.418 - ETA: 1s - loss: 9.5112 - acc: 0.409 - ETA: 1s - loss: 9.5366 - acc: 0.406 - ETA: 1s - loss: 9.5760 - acc: 0.404 - ETA: 1s - loss: 9.4734 - acc: 0.411 - ETA: 1s - loss: 9.4433 - acc: 0.413 - ETA: 1s - loss: 9.3884 - acc: 0.416 - ETA: 1s - loss: 9.4550 - acc: 0.412 - ETA: 1s - loss: 9.4594 - acc: 0.412 - ETA: 1s - loss: 9.4635 - acc: 0.412 - ETA: 1s - loss: 9.4111 - acc: 0.415 - ETA: 1s - loss: 9.4457 - acc: 0.413 - ETA: 0s - loss: 9.4643 - acc: 0.411 - ETA: 0s - loss: 9.5188 - acc: 0.408 - ETA: 0s - loss: 9.5079 - acc: 0.408 - ETA: 0s - loss: 9.5286 - acc: 0.407 - ETA: 0s - loss: 9.5238 - acc: 0.407 - ETA: 0s - loss: 9.5382 - acc: 0.407 - ETA: 0s - loss: 9.5155 - acc: 0.408 - ETA: 0s - loss: 9.4811 - acc: 0.410 - ETA: 0s - loss: 9.5219 - acc: 0.408 - ETA: 0s - loss: 9.4520 - acc: 0.412 - ETA: 0s - loss: 9.4421 - acc: 0.413 - ETA: 0s - loss: 9.4358 - acc: 0.413 - ETA: 0s - loss: 9.4051 - acc: 0.415 - ETA: 0s - loss: 9.4091 - acc: 0.415 - ETA: 0s - loss: 9.3966 - acc: 0.415 - ETA: 0s - loss: 9.4068 - acc: 0.415 - ETA: 0s - loss: 9.4397 - acc: 0.413 - 2s 298us/step - loss: 9.4629 - acc: 0.4118 - val_loss: 9.7311 - val_acc: 0.3844 Epoch 00011: val_loss did not improve Epoch 12/20 6680/6680 [==============================] - ETA: 1s - loss: 8.3831 - acc: 0.480 - ETA: 1s - loss: 8.4939 - acc: 0.470 - ETA: 1s - loss: 9.2641 - acc: 0.422 - ETA: 1s - loss: 9.2004 - acc: 0.425 - ETA: 1s - loss: 9.4689 - acc: 0.408 - ETA: 1s - loss: 9.4209 - acc: 0.410 - ETA: 1s - loss: 9.3852 - acc: 0.413 - ETA: 1s - loss: 9.6509 - acc: 0.397 - ETA: 1s - loss: 9.6343 - acc: 0.398 - ETA: 1s - loss: 9.6467 - acc: 0.398 - ETA: 1s - loss: 9.6797 - acc: 0.396 - ETA: 1s - loss: 9.7630 - acc: 0.391 - ETA: 1s - loss: 9.7040 - acc: 0.395 - ETA: 1s - loss: 9.6777 - acc: 0.397 - ETA: 1s - loss: 9.6329 - acc: 0.400 - ETA: 1s - loss: 9.5366 - acc: 0.406 - ETA: 0s - loss: 9.5056 - acc: 0.408 - ETA: 0s - loss: 9.4828 - acc: 0.410 - ETA: 0s - loss: 9.4756 - acc: 0.410 - ETA: 0s - loss: 9.4856 - acc: 0.410 - ETA: 0s - loss: 9.4676 - acc: 0.411 - ETA: 0s - loss: 9.4551 - acc: 0.411 - ETA: 0s - loss: 9.4468 - acc: 0.412 - ETA: 0s - loss: 9.4598 - acc: 0.411 - ETA: 0s - loss: 9.4552 - acc: 0.411 - ETA: 0s - loss: 9.4858 - acc: 0.410 - ETA: 0s - loss: 9.4928 - acc: 0.409 - ETA: 0s - loss: 9.4788 - acc: 0.410 - ETA: 0s - loss: 9.4572 - acc: 0.411 - ETA: 0s - loss: 9.4426 - acc: 0.412 - ETA: 0s - loss: 9.4501 - acc: 0.412 - ETA: 0s - loss: 9.4418 - acc: 0.413 - ETA: 0s - loss: 9.4339 - acc: 0.413 - 2s 300us/step - loss: 9.4620 - acc: 0.4118 - val_loss: 9.7082 - val_acc: 0.3820 Epoch 00012: val_loss did not improve Epoch 13/20 6680/6680 [==============================] - ETA: 1s - loss: 9.0261 - acc: 0.440 - ETA: 1s - loss: 9.1873 - acc: 0.430 - ETA: 1s - loss: 9.1873 - acc: 0.430 - ETA: 1s - loss: 9.1182 - acc: 0.434 - ETA: 1s - loss: 9.2231 - acc: 0.427 - ETA: 1s - loss: 9.2606 - acc: 0.425 - ETA: 1s - loss: 9.2741 - acc: 0.424 - ETA: 1s - loss: 9.2625 - acc: 0.425 - ETA: 1s - loss: 9.3390 - acc: 0.420 - ETA: 1s - loss: 9.4079 - acc: 0.416 - ETA: 1s - loss: 9.3792 - acc: 0.418 - ETA: 1s - loss: 9.3978 - acc: 0.417 - ETA: 1s - loss: 9.4454 - acc: 0.414 - ETA: 1s - loss: 9.4682 - acc: 0.412 - ETA: 1s - loss: 9.4840 - acc: 0.411 - ETA: 1s - loss: 9.5013 - acc: 0.410 - ETA: 0s - loss: 9.5173 - acc: 0.409 - ETA: 0s - loss: 9.5085 - acc: 0.408 - ETA: 0s - loss: 9.5216 - acc: 0.408 - ETA: 0s - 
loss: 9.5281 - acc: 0.407 - ETA: 0s - loss: 9.5115 - acc: 0.408 - ETA: 0s - loss: 9.5339 - acc: 0.407 - ETA: 0s - loss: 9.5042 - acc: 0.409 - ETA: 0s - loss: 9.5216 - acc: 0.408 - ETA: 0s - loss: 9.5217 - acc: 0.408 - ETA: 0s - loss: 9.5275 - acc: 0.407 - ETA: 0s - loss: 9.5360 - acc: 0.407 - ETA: 0s - loss: 9.5380 - acc: 0.407 - ETA: 0s - loss: 9.5455 - acc: 0.406 - ETA: 0s - loss: 9.5470 - acc: 0.406 - ETA: 0s - loss: 9.4982 - acc: 0.409 - ETA: 0s - loss: 9.4760 - acc: 0.411 - ETA: 0s - loss: 9.4572 - acc: 0.412 - 2s 300us/step - loss: 9.4630 - acc: 0.4120 - val_loss: 9.7668 - val_acc: 0.3784 Epoch 00013: val_loss did not improve Epoch 14/20 6680/6680 [==============================] - ETA: 1s - loss: 10.1544 - acc: 0.37 - ETA: 1s - loss: 9.4560 - acc: 0.4133 - ETA: 1s - loss: 8.9618 - acc: 0.444 - ETA: 1s - loss: 8.9572 - acc: 0.444 - ETA: 1s - loss: 9.2591 - acc: 0.425 - ETA: 1s - loss: 9.3633 - acc: 0.419 - ETA: 1s - loss: 9.3982 - acc: 0.416 - ETA: 1s - loss: 9.3486 - acc: 0.420 - ETA: 1s - loss: 9.3488 - acc: 0.420 - ETA: 1s - loss: 9.3234 - acc: 0.421 - ETA: 1s - loss: 9.3297 - acc: 0.421 - ETA: 1s - loss: 9.3033 - acc: 0.422 - ETA: 1s - loss: 9.2618 - acc: 0.425 - ETA: 1s - loss: 9.2508 - acc: 0.425 - ETA: 1s - loss: 9.3188 - acc: 0.421 - ETA: 1s - loss: 9.2895 - acc: 0.423 - ETA: 0s - loss: 9.3321 - acc: 0.420 - ETA: 0s - loss: 9.4409 - acc: 0.413 - ETA: 0s - loss: 9.4448 - acc: 0.413 - ETA: 0s - loss: 9.4316 - acc: 0.414 - ETA: 0s - loss: 9.4079 - acc: 0.415 - ETA: 0s - loss: 9.4352 - acc: 0.414 - ETA: 0s - loss: 9.4242 - acc: 0.414 - ETA: 0s - loss: 9.4347 - acc: 0.414 - ETA: 0s - loss: 9.4811 - acc: 0.411 - ETA: 0s - loss: 9.4628 - acc: 0.412 - ETA: 0s - loss: 9.4387 - acc: 0.413 - ETA: 0s - loss: 9.4413 - acc: 0.413 - ETA: 0s - loss: 9.4494 - acc: 0.412 - ETA: 0s - loss: 9.4187 - acc: 0.414 - ETA: 0s - loss: 9.4137 - acc: 0.415 - ETA: 0s - loss: 9.4226 - acc: 0.414 - ETA: 0s - loss: 9.4703 - acc: 0.411 - 2s 298us/step - loss: 9.4660 - acc: 0.4115 - val_loss: 9.7502 - val_acc: 0.3784 Epoch 00014: val_loss did not improve Epoch 15/20 6680/6680 [==============================] - ETA: 2s - loss: 8.8650 - acc: 0.450 - ETA: 1s - loss: 9.4036 - acc: 0.416 - ETA: 1s - loss: 9.4138 - acc: 0.416 - ETA: 1s - loss: 9.5794 - acc: 0.405 - ETA: 1s - loss: 9.6894 - acc: 0.398 - ETA: 1s - loss: 9.5849 - acc: 0.404 - ETA: 1s - loss: 9.4989 - acc: 0.410 - ETA: 1s - loss: 9.3120 - acc: 0.421 - ETA: 1s - loss: 9.3068 - acc: 0.421 - ETA: 1s - loss: 9.2688 - acc: 0.424 - ETA: 1s - loss: 9.1920 - acc: 0.429 - ETA: 1s - loss: 9.1845 - acc: 0.429 - ETA: 1s - loss: 9.2106 - acc: 0.428 - ETA: 1s - loss: 9.2149 - acc: 0.427 - ETA: 1s - loss: 9.2408 - acc: 0.426 - ETA: 1s - loss: 9.2477 - acc: 0.425 - ETA: 0s - loss: 9.2733 - acc: 0.423 - ETA: 0s - loss: 9.3328 - acc: 0.420 - ETA: 0s - loss: 9.3685 - acc: 0.417 - ETA: 0s - loss: 9.3923 - acc: 0.416 - ETA: 0s - loss: 9.3705 - acc: 0.417 - ETA: 0s - loss: 9.3920 - acc: 0.416 - ETA: 0s - loss: 9.4080 - acc: 0.415 - ETA: 0s - loss: 9.4192 - acc: 0.414 - ETA: 0s - loss: 9.4393 - acc: 0.413 - ETA: 0s - loss: 9.4326 - acc: 0.414 - ETA: 0s - loss: 9.4264 - acc: 0.414 - ETA: 0s - loss: 9.4099 - acc: 0.415 - ETA: 0s - loss: 9.4473 - acc: 0.413 - ETA: 0s - loss: 9.4604 - acc: 0.412 - ETA: 0s - loss: 9.4631 - acc: 0.412 - ETA: 0s - loss: 9.4723 - acc: 0.411 - ETA: 0s - loss: 9.4685 - acc: 0.411 - 2s 299us/step - loss: 9.4619 - acc: 0.4121 - val_loss: 9.7087 - val_acc: 0.3868 Epoch 00015: val_loss did not improve Epoch 16/20 6680/6680 [==============================] 
Epoch 16/20 - loss: 9.4604 - acc: 0.4123 - val_loss: 9.7383 - val_acc: 0.3832 (val_loss did not improve)
Epoch 17/20 - loss: 9.4561 - acc: 0.4127 - val_loss: 9.7131 - val_acc: 0.3820 (val_loss did not improve)
Epoch 18/20 - loss: 9.4594 - acc: 0.4124 - val_loss: 9.7869 - val_acc: 0.3832 (val_loss did not improve)
Epoch 19/20 - loss: 9.4577 - acc: 0.4129 - val_loss: 9.8454 - val_acc: 0.3749 (val_loss did not improve)
Epoch 20/20 - loss: 9.4547 - acc: 0.4133 - val_loss: 9.7335 - val_acc: 0.3820 (val_loss did not improve)
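The per-model blocks in this output come from fitting each transfer-learning head with a ModelCheckpoint callback that writes the weights whenever val_loss improves. The training cell itself is not reproduced here, so the sketch below is only a plausible reconstruction: the helper name train_branch, the batch size of 20, and the bottleneck-feature array names (train_Resnet50, valid_Resnet50) are assumptions inferred from the log output, while train_targets and valid_targets are the label arrays defined earlier in the notebook.

from keras.callbacks import ModelCheckpoint

# Sketch only -- reconstructed from the log output, not the original cell.
# train_Resnet50 / valid_Resnet50 are assumed names for the bottleneck features.
def train_branch(name, model, train_features, valid_features, weights_path):
    print('we are at', name)  # matches the 'we are at ...' lines in the log
    checkpointer = ModelCheckpoint(filepath=weights_path,
                                   verbose=1,  # prints the 'val_loss improved ...' lines
                                   save_best_only=True)
    model.fit(train_features, train_targets,
              validation_data=(valid_features, valid_targets),
              epochs=20, batch_size=20,  # batch size is an assumption
              callbacks=[checkpointer], verbose=1)

train_branch('Resnet50_model', Resnet50_model, train_Resnet50, valid_Resnet50,
             'saved_models/weights.best.Resnet50.hdf5')

With verbose=1, Keras rewrites an in-place progress bar, which is why the raw output is dominated by ETA updates; verbose=2 would emit a single summary line per epoch instead.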
we are at Resnet50_model
Train on 6680 samples, validate on 835 samples
Epoch 1/20 - loss: 2.2324 - acc: 0.5111 - val_loss: 1.1462 - val_acc: 0.7018 (val_loss improved from inf to 1.14624, saving model to saved_models/weights.best.Resnet50.hdf5)
Epoch 2/20 - loss: 0.6584 - acc: 0.8364 - val_loss: 0.7596 - val_acc: 0.7880 (val_loss improved from 1.14624 to 0.75959; saved)
Epoch 3/20 - loss: 0.3622 - acc: 0.9103 - val_loss: 0.6668 - val_acc: 0.7892 (val_loss improved from 0.75959 to 0.66684; saved)
Epoch 4/20 - loss: 0.2265 - acc: 0.9485 - val_loss: 0.6352 - val_acc: 0.7964 (val_loss improved from 0.66684 to 0.63516; saved)
Epoch 5/20 - loss: 0.1461 - acc: 0.9711 - val_loss: 0.6197 - val_acc: 0.8012 (val_loss improved from 0.63516 to 0.61971; saved)
Epoch 6/20 - loss: 0.1012 - acc: 0.9828 - val_loss: 0.5777 - val_acc: 0.8144 (val_loss improved from 0.61971 to 0.57768; saved)
Epoch 7/20 - loss: 0.0669 - acc: 0.9916 - val_loss: 0.5686 - val_acc: 0.8216 (val_loss improved from 0.57768 to 0.56864; saved)
Epoch 8/20 - loss: 0.0471 - acc: 0.9943 - val_loss: 0.5988 - val_acc: 0.8216 (val_loss did not improve)
Epoch 9/20 - loss: 0.0349 - acc: 0.9958 - val_loss: 0.5810 - val_acc: 0.8144 (val_loss did not improve)
Epoch 10/20 - loss: 0.0262 - acc: 0.9976 - val_loss: 0.6060 - val_acc: 0.8299 (val_loss did not improve)
Epoch 11/20 - loss: 0.0187 - acc: 0.9981 - val_loss: 0.6134 - val_acc: 0.8251 (val_loss did not improve)
Epoch 12/20 - loss: 0.0148 - acc: 0.9981 - val_loss: 0.6166 - val_acc: 0.8228 (val_loss did not improve)
Epoch 13/20 - loss: 0.0116 - acc: 0.9987 - val_loss: 0.6199 - val_acc: 0.8347 (val_loss did not improve)
Epoch 14/20 - loss: 0.0099 - acc: 0.9984 - val_loss: 0.6488 - val_acc: 0.8240 (val_loss did not improve)
Epoch 15/20 - loss: 0.0087 - acc: 0.9984 - val_loss: 0.6427 - val_acc: 0.8251 (val_loss did not improve)
Epoch 16/20 - loss: 0.0069 - acc: 0.9987 - val_loss: 0.6580 - val_acc: 0.8311 (val_loss did not improve)
Epoch 17/20 - loss: 0.0058 - acc: 0.9990 - val_loss: 0.6991 - val_acc: 0.8311 (val_loss did not improve)
Epoch 18/20 - loss: 0.0051 - acc: 0.9985 - val_loss: 0.6940 - val_acc: 0.8323 (val_loss did not improve)
Epoch 19/20 - loss: 0.0054 - acc: 0.9985 - val_loss: 0.6922 - val_acc: 0.8228 (val_loss did not improve)
Epoch 20/20 - loss: 0.0043 - acc: 0.9988 - val_loss: 0.7282 - val_acc: 0.8311 (val_loss did not improve)
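Because save_best_only=True, the checkpoint left on disk corresponds to epoch 7 (val_loss 0.5686, val_acc 0.8216) rather than epoch 20, where training accuracy has climbed to 0.9988 while val_loss has drifted up to 0.7282. A minimal sketch of the reload-and-score step, assuming test_Resnet50 and test_targets hold the test bottleneck features and one-hot labels (assumed names, following the notebook's conventions):

import numpy as np

# Reload the best checkpoint and score it on the held-out test set.
# test_Resnet50 / test_targets are assumed names, not shown in this output.
Resnet50_model.load_weights('saved_models/weights.best.Resnet50.hdf5')
predictions = [np.argmax(Resnet50_model.predict(np.expand_dims(f, axis=0)))
               for f in test_Resnet50]
test_accuracy = 100 * np.sum(np.array(predictions) ==
                             np.argmax(test_targets, axis=1)) / len(predictions)
print('Test accuracy: %.4f%%' % test_accuracy)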
we are at Resnet50_model2
Train on 6680 samples, validate on 835 samples
Epoch 1/20 - loss: 2.2861 - acc: 0.5024 - val_loss: 1.1268 - val_acc: 0.7102 (val_loss improved from inf to 1.12677, saving model to saved_models/weights.best.Resnet502.hdf5)
Epoch 2/20 - loss: 0.6636 - acc: 0.8350 - val_loss: 0.7832 - val_acc: 0.7749 (val_loss improved from 1.12677 to 0.78316; saved)
Epoch 3/20 - loss: 0.3640 - acc: 0.9087 - val_loss: 0.6495 - val_acc: 0.8036 (val_loss improved from 0.78316 to 0.64948; saved)
Epoch 4/20 - loss: 0.2250 - acc: 0.9487 - val_loss: 0.6117 - val_acc: 0.8144 (val_loss improved from 0.64948 to 0.61168; saved)
Epoch 5/20 - loss: 0.1468 - acc: 0.9707 - val_loss: 0.6234 - val_acc: 0.7988 (val_loss did not improve)
Epoch 6/20 - loss: 0.0981 - acc: 0.9832 - val_loss: 0.5861 - val_acc: 0.8168 (val_loss improved from 0.61168 to 0.58614; saved)
Epoch 7/20 - loss: 0.0696 - acc: 0.9901 - val_loss: 0.5800 - val_acc: 0.8228 (val_loss improved from 0.58614 to 0.58002; saved)
Epoch 8/20 - loss: 0.0479 - acc: 0.9936 - val_loss: 0.5928 - val_acc: 0.8192 (val_loss did not improve)
Epoch 9/20 - loss: 0.0355 - acc: 0.9963 - val_loss: 0.5949 - val_acc: 0.8204 (val_loss did not improve)
Epoch 10/20 - loss: 0.0253 - acc: 0.9973 - val_loss: 0.5880 - val_acc: 0.8299 (val_loss did not improve)
Epoch 11/20 - loss: 0.0199 - acc: 0.9981 - val_loss: 0.5878 - val_acc: 0.8395 (val_loss did not improve)
Epoch 12/20 - loss: 0.0157 - acc: 0.9982 - val_loss: 0.5789 - val_acc: 0.8240 (val_loss improved from 0.58002 to 0.57893; saved)
Epoch 13/20 - loss: 0.0121 - acc: 0.9979 - val_loss: 0.6233 - val_acc: 0.8335 (val_loss did not improve)
Epoch 14/20 - loss: 0.0102 - acc: 0.9984 - val_loss: 0.6156 - val_acc: 0.8359 (val_loss did not improve)
Epoch 15/20 - loss: 0.0089 - acc: 0.9979 - val_loss: 0.6275 - val_acc: 0.8371 (val_loss did not improve)
Epoch 16/20 - loss: 0.0073 - acc: 0.9982 - val_loss: 0.6167 - val_acc: 0.8371 (val_loss did not improve)
Epoch 17/20 - loss: 0.0063 - acc: 0.9985 - val_loss: 0.6762 - val_acc: 0.8323 (val_loss did not improve)
Epoch 18/20 - loss: 0.0056 - acc: 0.9987 - val_loss: 0.6639 - val_acc: 0.8383 (val_loss did not improve)
Epoch 19/20 - loss: 0.0060 - acc: 0.9984 - val_loss: 0.6988 - val_acc: 0.8299 (val_loss did not improve)
Epoch 20/20 - loss: 0.0043 - acc: 0.9987 - val_loss: 0.7151 - val_acc: 0.8311 (val_loss did not improve)
we are at Resnet50_model3
Train on 6680 samples, validate on 835 samples
Epoch 1/20 - loss: 2.2954 - acc: 0.4927 - val_loss: 1.1581 - val_acc: 0.6922 (val_loss improved from inf to 1.15807, saving model to saved_models/weights.best.Resnet503.hdf5)
Epoch 2/20 - loss: 0.6674 - acc: 0.8307 - val_loss: 0.7951 - val_acc: 0.7545 (val_loss improved from 1.15807 to 0.79512; saved)
Epoch 3/20 - loss: 0.3635 - acc: 0.9099 - val_loss: 0.6649 - val_acc: 0.7760 (val_loss improved from 0.79512 to 0.66495; saved)
Epoch 4/20 - loss: 0.2269 - acc: 0.9481 - val_loss: 0.6414 - val_acc: 0.7940 (val_loss improved from 0.66495 to 0.64140; saved)
Epoch 5/20 - loss: 0.1501 - acc: 0.9692 - val_loss: 0.6388 - val_acc: 0.7904 (val_loss improved from 0.64140 to 0.63876; saved)
Epoch 6/20 - loss: 0.1003 - acc: 0.9817 - val_loss: 0.6308 - val_acc: 0.8036 (val_loss improved from 0.63876 to 0.63080; saved)
Epoch 7/20 - loss: 0.0672 - acc: 0.9898 - val_loss: 0.5876 - val_acc: 0.8192 (val_loss improved from 0.63080 to 0.58763; saved)
Epoch 8/20 - loss: 0.0494 - acc: 0.9939 - val_loss: 0.5917 - val_acc: 0.8132 (val_loss did not improve)
Epoch 9/20 - loss: 0.0357 - acc: 0.9964 - val_loss: 0.6161 - val_acc: 0.8168 (val_loss did not improve)
Epoch 10/20 - loss: 0.0259 - acc: 0.9973 - val_loss: 0.5820 - val_acc: 0.8228 (val_loss improved from 0.58763 to 0.58201; saved)
Epoch 11/20 - loss: 0.0191 - acc: 0.9984 - val_loss: 0.5974 - val_acc: 0.8216 (val_loss did not improve)
Epoch 12/20 - loss: 0.0150 - acc: 0.9981 - val_loss: 0.6262 - val_acc: 0.8204 (val_loss did not improve)
Epoch 13/20 - loss: 0.0121 - acc: 0.9981 - val_loss: 0.6039 - val_acc: 0.8299 (val_loss did not improve)
Epoch 14/20 - loss: 0.0094 - acc: 0.9987 - val_loss: 0.6525 - val_acc: 0.8120 (val_loss did not improve)
Epoch 15/20 - loss: 0.0090 - acc: 0.9982 - val_loss: 0.6665 - val_acc: 0.8192 (val_loss did not improve)
Epoch 16/20 - loss: 0.0069 - acc: 0.9985 - val_loss: 0.6530 - val_acc: 0.8251 (val_loss did not improve)
Epoch 17/20 - loss: 0.0063 - acc: 0.9988 - val_loss: 0.6822 - val_acc: 0.8311 (val_loss did not improve)
Epoch 18/20 - loss: 0.0055 - acc: 0.9985 - val_loss: 0.6748 - val_acc: 0.8311 (val_loss did not improve)
Epoch 19/20 - loss: 0.0053 - acc: 0.9984 - val_loss: 0.7001 - val_acc: 0.8263 (val_loss did not improve)
Epoch 20/20 - loss: 0.0047 - acc: 0.9987 - val_loss: 0.7348 - val_acc: 0.8347 (val_loss did not improve)
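The three Resnet50 blocks above are repeat runs of the same head, and their best checkpoints land close together (val_loss 0.56864, 0.57893, 0.58201), which suggests the result is stable across random initializations. If the three runs really do share one architecture, the saved weight files can be reloaded into a single model object and compared directly; a sketch, with valid_Resnet50 again an assumed name for the validation bottleneck features:

import numpy as np

# Compare the three Resnet50 runs by reloading each best checkpoint.
# Assumes all three runs used the same architecture as Resnet50_model.
for path in ['saved_models/weights.best.Resnet50.hdf5',
             'saved_models/weights.best.Resnet502.hdf5',
             'saved_models/weights.best.Resnet503.hdf5']:
    Resnet50_model.load_weights(path)
    preds = np.argmax(Resnet50_model.predict(valid_Resnet50), axis=1)
    acc = np.mean(preds == np.argmax(valid_targets, axis=1))
    print('%s: val accuracy %.4f' % (path, acc))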
we are at Xception_model
Train on 6680 samples, validate on 835 samples
Epoch 1/20 - loss: 9.8738 - acc: 0.3464 - val_loss: 9.3883 - val_acc: 0.3976 (val_loss improved from inf to 9.38835, saving model to saved_models/weights.best.Xception.hdf5)
Epoch 2/20 - loss: 8.9089 - acc: 0.4280 - val_loss: 8.9788 - val_acc: 0.4216 (val_loss improved from 9.38835 to 8.97877; saved)
Epoch 3/20 - loss: 8.5092 - acc: 0.4572 - val_loss: 8.8826 - val_acc: 0.4263 (val_loss improved from 8.97877 to 8.88263; saved)
Epoch 4/20 - loss: 8.4465 - acc: 0.4648 - val_loss: 8.7307 - val_acc: 0.4359 (val_loss improved from 8.88263 to 8.73068; saved)
Epoch 5/20 - loss: 8.3411 - acc: 0.4725 - val_loss: 8.8475 - val_acc: 0.4311 (val_loss did not improve)
Epoch 6/20 - loss: 8.1093 - acc: 0.4882 - val_loss: 8.6683 - val_acc: 0.4503 (val_loss improved from 8.73068 to 8.66827; saved)
Epoch 7/20 - loss: 7.9954 - acc: 0.4975 - val_loss: 8.5826 - val_acc: 0.4611 (val_loss improved from 8.66827 to 8.58259; saved)
Epoch 8/20 - loss: 7.9408 - acc: 0.5025 - val_loss: 8.6324 - val_acc: 0.4491 (val_loss did not improve)
Epoch 9/20 - loss: 7.9150 - acc: 0.5046 - val_loss: 8.5444 - val_acc: 0.4563 (val_loss improved from 8.58259 to 8.54443; saved)
Epoch 10/20 - ETA: 2s - loss: 7.8956 - acc: 0.506 -
ETA: 2s - loss: 7.8848 - acc: 0.506 - ETA: 2s - loss: 7.8763 - acc: 0.507 - ETA: 2s - loss: 7.9031 - acc: 0.505 - ETA: 1s - loss: 7.8888 - acc: 0.506 - ETA: 1s - loss: 7.8534 - acc: 0.508 - ETA: 1s - loss: 7.8628 - acc: 0.508 - ETA: 1s - loss: 7.8592 - acc: 0.508 - ETA: 1s - loss: 7.8551 - acc: 0.508 - ETA: 0s - loss: 7.8557 - acc: 0.508 - ETA: 0s - loss: 7.8520 - acc: 0.508 - ETA: 0s - loss: 7.8475 - acc: 0.508 - ETA: 0s - loss: 7.8458 - acc: 0.509 - ETA: 0s - loss: 7.8637 - acc: 0.508 - 13s 2ms/step - loss: 7.8660 - acc: 0.5079 - val_loss: 8.5807 - val_acc: 0.4527 Epoch 00010: val_loss did not improve Epoch 11/20 6680/6680 [==============================] - ETA: 12s - loss: 8.3814 - acc: 0.48 - ETA: 12s - loss: 8.2202 - acc: 0.49 - ETA: 12s - loss: 7.8268 - acc: 0.51 - ETA: 11s - loss: 7.8458 - acc: 0.51 - ETA: 11s - loss: 7.8480 - acc: 0.51 - ETA: 11s - loss: 7.9638 - acc: 0.50 - ETA: 11s - loss: 8.0273 - acc: 0.50 - ETA: 11s - loss: 7.9103 - acc: 0.50 - ETA: 11s - loss: 7.8375 - acc: 0.51 - ETA: 10s - loss: 7.7791 - acc: 0.51 - ETA: 10s - loss: 7.8376 - acc: 0.51 - ETA: 10s - loss: 7.8292 - acc: 0.51 - ETA: 10s - loss: 7.8345 - acc: 0.51 - ETA: 10s - loss: 7.7569 - acc: 0.51 - ETA: 9s - loss: 7.7986 - acc: 0.5140 - ETA: 9s - loss: 7.8451 - acc: 0.511 - ETA: 9s - loss: 7.9051 - acc: 0.507 - ETA: 9s - loss: 7.9136 - acc: 0.507 - ETA: 9s - loss: 7.8873 - acc: 0.508 - ETA: 9s - loss: 7.8744 - acc: 0.509 - ETA: 8s - loss: 7.8909 - acc: 0.508 - ETA: 8s - loss: 7.8125 - acc: 0.513 - ETA: 8s - loss: 7.9191 - acc: 0.506 - ETA: 8s - loss: 7.8856 - acc: 0.508 - ETA: 8s - loss: 7.9247 - acc: 0.506 - ETA: 7s - loss: 7.9303 - acc: 0.505 - ETA: 7s - loss: 7.8932 - acc: 0.507 - ETA: 7s - loss: 7.8761 - acc: 0.508 - ETA: 7s - loss: 7.9047 - acc: 0.507 - ETA: 7s - loss: 7.8831 - acc: 0.508 - ETA: 6s - loss: 7.8497 - acc: 0.510 - ETA: 6s - loss: 7.8163 - acc: 0.512 - ETA: 6s - loss: 7.8188 - acc: 0.512 - ETA: 6s - loss: 7.8022 - acc: 0.513 - ETA: 6s - loss: 7.8740 - acc: 0.509 - ETA: 5s - loss: 7.9015 - acc: 0.507 - ETA: 5s - loss: 7.8535 - acc: 0.510 - ETA: 5s - loss: 7.8377 - acc: 0.511 - ETA: 5s - loss: 7.8264 - acc: 0.512 - ETA: 5s - loss: 7.7839 - acc: 0.514 - ETA: 4s - loss: 7.7785 - acc: 0.515 - ETA: 4s - loss: 7.7949 - acc: 0.514 - ETA: 4s - loss: 7.7644 - acc: 0.515 - ETA: 4s - loss: 7.7602 - acc: 0.516 - ETA: 4s - loss: 7.7777 - acc: 0.515 - ETA: 4s - loss: 7.8225 - acc: 0.512 - ETA: 3s - loss: 7.8143 - acc: 0.512 - ETA: 3s - loss: 7.8295 - acc: 0.511 - ETA: 3s - loss: 7.8210 - acc: 0.512 - ETA: 3s - loss: 7.7963 - acc: 0.513 - ETA: 3s - loss: 7.8022 - acc: 0.512 - ETA: 2s - loss: 7.8102 - acc: 0.512 - ETA: 2s - loss: 7.8210 - acc: 0.511 - ETA: 2s - loss: 7.8463 - acc: 0.510 - ETA: 2s - loss: 7.8589 - acc: 0.509 - ETA: 2s - loss: 7.8836 - acc: 0.508 - ETA: 1s - loss: 7.9026 - acc: 0.506 - ETA: 1s - loss: 7.8977 - acc: 0.507 - ETA: 1s - loss: 7.9004 - acc: 0.506 - ETA: 1s - loss: 7.8980 - acc: 0.506 - ETA: 1s - loss: 7.8900 - acc: 0.507 - ETA: 0s - loss: 7.8850 - acc: 0.507 - ETA: 0s - loss: 7.8741 - acc: 0.508 - ETA: 0s - loss: 7.8719 - acc: 0.508 - ETA: 0s - loss: 7.8566 - acc: 0.509 - ETA: 0s - loss: 7.8841 - acc: 0.507 - 13s 2ms/step - loss: 7.8766 - acc: 0.5081 - val_loss: 8.6384 - val_acc: 0.4515 Epoch 00011: val_loss did not improve Epoch 12/20 6680/6680 [==============================] - ETA: 12s - loss: 8.2203 - acc: 0.49 - ETA: 12s - loss: 8.3009 - acc: 0.48 - ETA: 12s - loss: 7.8637 - acc: 0.51 - ETA: 11s - loss: 7.8320 - acc: 0.51 - ETA: 11s - loss: 7.9563 - acc: 0.50 - ETA: 11s - 
loss: 8.0003 - acc: 0.50 - ETA: 11s - loss: 7.9478 - acc: 0.50 - ETA: 11s - loss: 8.0167 - acc: 0.49 - ETA: 11s - loss: 7.9322 - acc: 0.50 - ETA: 11s - loss: 7.9046 - acc: 0.50 - ETA: 10s - loss: 7.8678 - acc: 0.50 - ETA: 10s - loss: 7.8837 - acc: 0.50 - ETA: 10s - loss: 7.8421 - acc: 0.50 - ETA: 10s - loss: 7.8576 - acc: 0.50 - ETA: 10s - loss: 7.7206 - acc: 0.51 - ETA: 9s - loss: 7.8123 - acc: 0.5119 - ETA: 9s - loss: 7.8571 - acc: 0.508 - ETA: 9s - loss: 7.8862 - acc: 0.507 - ETA: 9s - loss: 7.8915 - acc: 0.506 - ETA: 9s - loss: 7.8653 - acc: 0.508 - ETA: 9s - loss: 7.8668 - acc: 0.508 - ETA: 8s - loss: 7.8243 - acc: 0.510 - ETA: 8s - loss: 7.7799 - acc: 0.513 - ETA: 8s - loss: 7.7935 - acc: 0.512 - ETA: 8s - loss: 7.8041 - acc: 0.512 - ETA: 7s - loss: 7.7480 - acc: 0.515 - ETA: 7s - loss: 7.7894 - acc: 0.512 - ETA: 7s - loss: 7.7702 - acc: 0.513 - ETA: 7s - loss: 7.7733 - acc: 0.513 - ETA: 7s - loss: 7.7932 - acc: 0.512 - ETA: 7s - loss: 7.7813 - acc: 0.513 - ETA: 6s - loss: 7.8131 - acc: 0.511 - ETA: 6s - loss: 7.8255 - acc: 0.510 - ETA: 6s - loss: 7.8323 - acc: 0.510 - ETA: 6s - loss: 7.8202 - acc: 0.511 - ETA: 6s - loss: 7.8179 - acc: 0.511 - ETA: 5s - loss: 7.7983 - acc: 0.512 - ETA: 5s - loss: 7.8114 - acc: 0.511 - ETA: 5s - loss: 7.8012 - acc: 0.512 - ETA: 5s - loss: 7.8220 - acc: 0.510 - ETA: 5s - loss: 7.8542 - acc: 0.508 - ETA: 4s - loss: 7.8169 - acc: 0.511 - ETA: 4s - loss: 7.8134 - acc: 0.511 - ETA: 4s - loss: 7.7970 - acc: 0.512 - ETA: 4s - loss: 7.7885 - acc: 0.513 - ETA: 4s - loss: 7.7770 - acc: 0.513 - ETA: 3s - loss: 7.7969 - acc: 0.512 - ETA: 3s - loss: 7.8058 - acc: 0.511 - ETA: 3s - loss: 7.7747 - acc: 0.513 - ETA: 3s - loss: 7.7675 - acc: 0.514 - ETA: 3s - loss: 7.7416 - acc: 0.516 - ETA: 2s - loss: 7.7447 - acc: 0.515 - ETA: 2s - loss: 7.7447 - acc: 0.515 - ETA: 2s - loss: 7.7326 - acc: 0.516 - ETA: 2s - loss: 7.7298 - acc: 0.516 - ETA: 2s - loss: 7.7587 - acc: 0.515 - ETA: 1s - loss: 7.7650 - acc: 0.514 - ETA: 1s - loss: 7.7734 - acc: 0.514 - ETA: 1s - loss: 7.7833 - acc: 0.513 - ETA: 1s - loss: 7.7913 - acc: 0.512 - ETA: 1s - loss: 7.8010 - acc: 0.512 - ETA: 0s - loss: 7.8182 - acc: 0.511 - ETA: 0s - loss: 7.8313 - acc: 0.510 - ETA: 0s - loss: 7.8181 - acc: 0.511 - ETA: 0s - loss: 7.8391 - acc: 0.509 - ETA: 0s - loss: 7.8351 - acc: 0.510 - 13s 2ms/step - loss: 7.8618 - acc: 0.5085 - val_loss: 8.5911 - val_acc: 0.4551 Epoch 00012: val_loss did not improve Epoch 13/20 6680/6680 [==============================] - ETA: 12s - loss: 7.0920 - acc: 0.56 - ETA: 12s - loss: 7.4949 - acc: 0.53 - ETA: 12s - loss: 7.5170 - acc: 0.53 - ETA: 12s - loss: 7.7157 - acc: 0.52 - ETA: 11s - loss: 7.8488 - acc: 0.51 - ETA: 11s - loss: 7.7651 - acc: 0.51 - ETA: 11s - loss: 7.6689 - acc: 0.52 - ETA: 11s - loss: 7.6382 - acc: 0.52 - ETA: 11s - loss: 7.5764 - acc: 0.52 - ETA: 10s - loss: 7.7697 - acc: 0.51 - ETA: 10s - loss: 7.6349 - acc: 0.52 - ETA: 10s - loss: 7.6247 - acc: 0.52 - ETA: 10s - loss: 7.7077 - acc: 0.51 - ETA: 10s - loss: 7.6407 - acc: 0.52 - ETA: 9s - loss: 7.5933 - acc: 0.5267 - ETA: 9s - loss: 7.5724 - acc: 0.528 - ETA: 9s - loss: 7.6674 - acc: 0.522 - ETA: 9s - loss: 7.7071 - acc: 0.520 - ETA: 9s - loss: 7.6840 - acc: 0.521 - ETA: 9s - loss: 7.7672 - acc: 0.516 - ETA: 8s - loss: 7.7515 - acc: 0.516 - ETA: 8s - loss: 7.7654 - acc: 0.515 - ETA: 8s - loss: 7.7572 - acc: 0.516 - ETA: 8s - loss: 7.7228 - acc: 0.518 - ETA: 8s - loss: 7.7298 - acc: 0.518 - ETA: 7s - loss: 7.7362 - acc: 0.518 - ETA: 7s - loss: 7.7736 - acc: 0.515 - ETA: 7s - loss: 7.7556 - acc: 0.516 - ETA: 7s 
- loss: 7.8105 - acc: 0.513 - ETA: 7s - loss: 7.8403 - acc: 0.511 - ETA: 6s - loss: 7.8214 - acc: 0.512 - ETA: 6s - loss: 7.8339 - acc: 0.511 - ETA: 6s - loss: 7.8761 - acc: 0.509 - ETA: 6s - loss: 7.8775 - acc: 0.508 - ETA: 6s - loss: 7.8983 - acc: 0.507 - ETA: 5s - loss: 7.9051 - acc: 0.506 - ETA: 5s - loss: 7.9052 - acc: 0.506 - ETA: 5s - loss: 7.9023 - acc: 0.506 - ETA: 5s - loss: 7.9187 - acc: 0.505 - ETA: 5s - loss: 7.9037 - acc: 0.506 - ETA: 4s - loss: 7.9296 - acc: 0.505 - ETA: 4s - loss: 7.9326 - acc: 0.505 - ETA: 4s - loss: 7.9440 - acc: 0.504 - ETA: 4s - loss: 7.9101 - acc: 0.506 - ETA: 4s - loss: 7.8776 - acc: 0.508 - ETA: 4s - loss: 7.8500 - acc: 0.510 - ETA: 3s - loss: 7.8304 - acc: 0.511 - ETA: 3s - loss: 7.8419 - acc: 0.510 - ETA: 3s - loss: 7.8398 - acc: 0.511 - ETA: 3s - loss: 7.8506 - acc: 0.510 - ETA: 3s - loss: 7.8673 - acc: 0.509 - ETA: 2s - loss: 7.8586 - acc: 0.510 - ETA: 2s - loss: 7.8594 - acc: 0.510 - ETA: 2s - loss: 7.8543 - acc: 0.510 - ETA: 2s - loss: 7.8385 - acc: 0.511 - ETA: 2s - loss: 7.8425 - acc: 0.510 - ETA: 1s - loss: 7.8154 - acc: 0.512 - ETA: 1s - loss: 7.8112 - acc: 0.512 - ETA: 1s - loss: 7.8121 - acc: 0.512 - ETA: 1s - loss: 7.7884 - acc: 0.514 - ETA: 1s - loss: 7.7875 - acc: 0.514 - ETA: 0s - loss: 7.7893 - acc: 0.514 - ETA: 0s - loss: 7.7856 - acc: 0.514 - ETA: 0s - loss: 7.7823 - acc: 0.514 - ETA: 0s - loss: 7.7802 - acc: 0.514 - ETA: 0s - loss: 7.7991 - acc: 0.513 - 13s 2ms/step - loss: 7.8215 - acc: 0.5123 - val_loss: 8.5462 - val_acc: 0.4599 Epoch 00013: val_loss did not improve Epoch 14/20 6680/6680 [==============================] - ETA: 12s - loss: 8.0590 - acc: 0.50 - ETA: 12s - loss: 8.3814 - acc: 0.48 - ETA: 12s - loss: 8.4351 - acc: 0.47 - ETA: 11s - loss: 8.0654 - acc: 0.49 - ETA: 11s - loss: 8.1609 - acc: 0.49 - ETA: 11s - loss: 8.0645 - acc: 0.49 - ETA: 11s - loss: 7.8335 - acc: 0.51 - ETA: 11s - loss: 7.8214 - acc: 0.51 - ETA: 11s - loss: 7.8057 - acc: 0.51 - ETA: 10s - loss: 7.6376 - acc: 0.52 - ETA: 10s - loss: 7.7286 - acc: 0.51 - ETA: 10s - loss: 7.6353 - acc: 0.52 - ETA: 10s - loss: 7.6679 - acc: 0.52 - ETA: 10s - loss: 7.7419 - acc: 0.51 - ETA: 9s - loss: 7.7738 - acc: 0.5147 - ETA: 9s - loss: 7.7211 - acc: 0.518 - ETA: 9s - loss: 7.7504 - acc: 0.516 - ETA: 9s - loss: 7.7676 - acc: 0.515 - ETA: 9s - loss: 7.7420 - acc: 0.516 - ETA: 8s - loss: 7.7417 - acc: 0.516 - ETA: 8s - loss: 7.7608 - acc: 0.515 - ETA: 8s - loss: 7.7377 - acc: 0.516 - ETA: 8s - loss: 7.7657 - acc: 0.515 - ETA: 8s - loss: 7.7780 - acc: 0.514 - ETA: 8s - loss: 7.7828 - acc: 0.514 - ETA: 7s - loss: 7.8120 - acc: 0.512 - ETA: 7s - loss: 7.7495 - acc: 0.516 - ETA: 7s - loss: 7.7438 - acc: 0.516 - ETA: 7s - loss: 7.6945 - acc: 0.519 - ETA: 7s - loss: 7.7228 - acc: 0.518 - ETA: 6s - loss: 7.7181 - acc: 0.518 - ETA: 6s - loss: 7.6884 - acc: 0.520 - ETA: 6s - loss: 7.7045 - acc: 0.519 - ETA: 6s - loss: 7.7303 - acc: 0.517 - ETA: 6s - loss: 7.7531 - acc: 0.516 - ETA: 5s - loss: 7.7929 - acc: 0.513 - ETA: 5s - loss: 7.7923 - acc: 0.513 - ETA: 5s - loss: 7.7909 - acc: 0.513 - ETA: 5s - loss: 7.7853 - acc: 0.514 - ETA: 5s - loss: 7.7841 - acc: 0.514 - ETA: 4s - loss: 7.8144 - acc: 0.512 - ETA: 4s - loss: 7.8053 - acc: 0.513 - ETA: 4s - loss: 7.7956 - acc: 0.513 - ETA: 4s - loss: 7.8053 - acc: 0.513 - ETA: 4s - loss: 7.8181 - acc: 0.512 - ETA: 3s - loss: 7.8093 - acc: 0.513 - ETA: 3s - loss: 7.8592 - acc: 0.510 - ETA: 3s - loss: 7.8835 - acc: 0.508 - ETA: 3s - loss: 7.8772 - acc: 0.509 - ETA: 3s - loss: 7.8583 - acc: 0.510 - ETA: 3s - loss: 7.8148 - acc: 0.512 - ETA: 
2s - loss: 7.8319 - acc: 0.511 - ETA: 2s - loss: 7.8119 - acc: 0.513 - ETA: 2s - loss: 7.8423 - acc: 0.511 - ETA: 2s - loss: 7.8140 - acc: 0.513 - ETA: 2s - loss: 7.7962 - acc: 0.514 - ETA: 1s - loss: 7.7923 - acc: 0.514 - ETA: 1s - loss: 7.7890 - acc: 0.514 - ETA: 1s - loss: 7.7918 - acc: 0.514 - ETA: 1s - loss: 7.8034 - acc: 0.513 - ETA: 1s - loss: 7.7834 - acc: 0.514 - ETA: 0s - loss: 7.8087 - acc: 0.513 - ETA: 0s - loss: 7.8100 - acc: 0.513 - ETA: 0s - loss: 7.7937 - acc: 0.514 - ETA: 0s - loss: 7.7854 - acc: 0.514 - ETA: 0s - loss: 7.7945 - acc: 0.514 - 13s 2ms/step - loss: 7.8000 - acc: 0.5139 - val_loss: 8.5403 - val_acc: 0.4647 Epoch 00014: val_loss improved from 8.54443 to 8.54030, saving model to saved_models/weights.best.Xception.hdf5 Epoch 15/20 6680/6680 [==============================] - ETA: 12s - loss: 7.5755 - acc: 0.53 - ETA: 12s - loss: 8.2202 - acc: 0.49 - ETA: 12s - loss: 8.1128 - acc: 0.49 - ETA: 11s - loss: 8.3008 - acc: 0.48 - ETA: 11s - loss: 8.2525 - acc: 0.48 - ETA: 11s - loss: 8.4083 - acc: 0.47 - ETA: 11s - loss: 8.3668 - acc: 0.48 - ETA: 11s - loss: 8.2679 - acc: 0.48 - ETA: 11s - loss: 8.2097 - acc: 0.48 - ETA: 10s - loss: 8.0012 - acc: 0.50 - ETA: 10s - loss: 8.0024 - acc: 0.50 - ETA: 10s - loss: 7.9937 - acc: 0.50 - ETA: 10s - loss: 7.9641 - acc: 0.50 - ETA: 10s - loss: 7.8903 - acc: 0.50 - ETA: 9s - loss: 7.8626 - acc: 0.5100 - ETA: 9s - loss: 7.9454 - acc: 0.505 - ETA: 9s - loss: 7.9805 - acc: 0.502 - ETA: 9s - loss: 7.9987 - acc: 0.501 - ETA: 9s - loss: 7.9425 - acc: 0.505 - ETA: 8s - loss: 7.9806 - acc: 0.503 - ETA: 8s - loss: 7.9537 - acc: 0.504 - ETA: 8s - loss: 7.9439 - acc: 0.505 - ETA: 8s - loss: 7.9419 - acc: 0.505 - ETA: 8s - loss: 7.9266 - acc: 0.506 - ETA: 7s - loss: 7.8223 - acc: 0.513 - ETA: 7s - loss: 7.7964 - acc: 0.514 - ETA: 7s - loss: 7.8002 - acc: 0.514 - ETA: 7s - loss: 7.8672 - acc: 0.510 - ETA: 7s - loss: 7.8256 - acc: 0.512 - ETA: 7s - loss: 7.8280 - acc: 0.512 - ETA: 6s - loss: 7.8182 - acc: 0.513 - ETA: 6s - loss: 7.8560 - acc: 0.510 - ETA: 6s - loss: 7.8525 - acc: 0.511 - ETA: 6s - loss: 7.8332 - acc: 0.512 - ETA: 6s - loss: 7.8166 - acc: 0.513 - ETA: 5s - loss: 7.8099 - acc: 0.513 - ETA: 5s - loss: 7.8210 - acc: 0.513 - ETA: 5s - loss: 7.8188 - acc: 0.513 - ETA: 5s - loss: 7.8250 - acc: 0.513 - ETA: 5s - loss: 7.8271 - acc: 0.512 - ETA: 4s - loss: 7.8328 - acc: 0.512 - ETA: 4s - loss: 7.8267 - acc: 0.512 - ETA: 4s - loss: 7.8096 - acc: 0.514 - ETA: 4s - loss: 7.8152 - acc: 0.513 - ETA: 4s - loss: 7.8099 - acc: 0.514 - ETA: 3s - loss: 7.8118 - acc: 0.513 - ETA: 3s - loss: 7.7862 - acc: 0.515 - ETA: 3s - loss: 7.7935 - acc: 0.515 - ETA: 3s - loss: 7.8100 - acc: 0.513 - ETA: 3s - loss: 7.7909 - acc: 0.515 - ETA: 3s - loss: 7.7994 - acc: 0.514 - ETA: 2s - loss: 7.7937 - acc: 0.514 - ETA: 2s - loss: 7.7875 - acc: 0.515 - ETA: 2s - loss: 7.8006 - acc: 0.514 - ETA: 2s - loss: 7.8113 - acc: 0.513 - ETA: 2s - loss: 7.8070 - acc: 0.513 - ETA: 1s - loss: 7.8186 - acc: 0.513 - ETA: 1s - loss: 7.8355 - acc: 0.511 - ETA: 1s - loss: 7.8257 - acc: 0.512 - ETA: 1s - loss: 7.8365 - acc: 0.511 - ETA: 1s - loss: 7.8455 - acc: 0.511 - ETA: 0s - loss: 7.8385 - acc: 0.511 - ETA: 0s - loss: 7.8241 - acc: 0.512 - ETA: 0s - loss: 7.8227 - acc: 0.512 - ETA: 0s - loss: 7.8065 - acc: 0.513 - ETA: 0s - loss: 7.8104 - acc: 0.513 - 13s 2ms/step - loss: 7.7989 - acc: 0.5144 - val_loss: 8.4653 - val_acc: 0.4659 Epoch 00015: val_loss improved from 8.54030 to 8.46532, saving model to saved_models/weights.best.Xception.hdf5 Epoch 16/20 6680/6680 
[==============================] - ETA: 12s - loss: 7.5755 - acc: 0.53 - ETA: 12s - loss: 7.1207 - acc: 0.55 - ETA: 11s - loss: 7.2186 - acc: 0.55 - ETA: 11s - loss: 6.9451 - acc: 0.56 - ETA: 11s - loss: 7.1357 - acc: 0.55 - ETA: 11s - loss: 7.3433 - acc: 0.54 - ETA: 11s - loss: 7.4916 - acc: 0.53 - ETA: 11s - loss: 7.4215 - acc: 0.53 - ETA: 10s - loss: 7.3849 - acc: 0.54 - ETA: 10s - loss: 7.4040 - acc: 0.54 - ETA: 10s - loss: 7.3610 - acc: 0.54 - ETA: 10s - loss: 7.2947 - acc: 0.54 - ETA: 10s - loss: 7.2937 - acc: 0.54 - ETA: 10s - loss: 7.3829 - acc: 0.54 - ETA: 9s - loss: 7.3635 - acc: 0.5420 - ETA: 9s - loss: 7.3667 - acc: 0.541 - ETA: 9s - loss: 7.3816 - acc: 0.540 - ETA: 9s - loss: 7.4909 - acc: 0.533 - ETA: 9s - loss: 7.4529 - acc: 0.536 - ETA: 8s - loss: 7.5238 - acc: 0.532 - ETA: 8s - loss: 7.4802 - acc: 0.534 - ETA: 8s - loss: 7.4992 - acc: 0.533 - ETA: 8s - loss: 7.5375 - acc: 0.531 - ETA: 8s - loss: 7.5639 - acc: 0.529 - ETA: 8s - loss: 7.5773 - acc: 0.528 - ETA: 7s - loss: 7.5524 - acc: 0.530 - ETA: 7s - loss: 7.5501 - acc: 0.530 - ETA: 7s - loss: 7.5510 - acc: 0.530 - ETA: 7s - loss: 7.5574 - acc: 0.530 - ETA: 7s - loss: 7.5687 - acc: 0.529 - ETA: 6s - loss: 7.6002 - acc: 0.527 - ETA: 6s - loss: 7.6346 - acc: 0.525 - ETA: 6s - loss: 7.6381 - acc: 0.524 - ETA: 6s - loss: 7.6173 - acc: 0.526 - ETA: 6s - loss: 7.6253 - acc: 0.525 - ETA: 5s - loss: 7.6195 - acc: 0.526 - ETA: 5s - loss: 7.6444 - acc: 0.524 - ETA: 5s - loss: 7.6172 - acc: 0.526 - ETA: 5s - loss: 7.6698 - acc: 0.523 - ETA: 5s - loss: 7.7017 - acc: 0.521 - ETA: 4s - loss: 7.6986 - acc: 0.521 - ETA: 4s - loss: 7.6958 - acc: 0.521 - ETA: 4s - loss: 7.6967 - acc: 0.521 - ETA: 4s - loss: 7.7196 - acc: 0.520 - ETA: 4s - loss: 7.7307 - acc: 0.519 - ETA: 4s - loss: 7.7133 - acc: 0.520 - ETA: 3s - loss: 7.7070 - acc: 0.520 - ETA: 3s - loss: 7.7208 - acc: 0.520 - ETA: 3s - loss: 7.7014 - acc: 0.521 - ETA: 3s - loss: 7.6860 - acc: 0.522 - ETA: 3s - loss: 7.7048 - acc: 0.520 - ETA: 2s - loss: 7.7178 - acc: 0.520 - ETA: 2s - loss: 7.7060 - acc: 0.520 - ETA: 2s - loss: 7.6946 - acc: 0.521 - ETA: 2s - loss: 7.6866 - acc: 0.522 - ETA: 2s - loss: 7.6731 - acc: 0.522 - ETA: 1s - loss: 7.6745 - acc: 0.522 - ETA: 1s - loss: 7.6784 - acc: 0.522 - ETA: 1s - loss: 7.6958 - acc: 0.521 - ETA: 1s - loss: 7.7045 - acc: 0.520 - ETA: 1s - loss: 7.7262 - acc: 0.519 - ETA: 0s - loss: 7.7471 - acc: 0.518 - ETA: 0s - loss: 7.7406 - acc: 0.518 - ETA: 0s - loss: 7.7456 - acc: 0.518 - ETA: 0s - loss: 7.7628 - acc: 0.517 - ETA: 0s - loss: 7.7643 - acc: 0.517 - 13s 2ms/step - loss: 7.7751 - acc: 0.5163 - val_loss: 8.5275 - val_acc: 0.4599 Epoch 00016: val_loss did not improve Epoch 17/20 6680/6680 [==============================] - ETA: 12s - loss: 8.2202 - acc: 0.49 - ETA: 12s - loss: 7.4143 - acc: 0.54 - ETA: 12s - loss: 8.0590 - acc: 0.50 - ETA: 12s - loss: 7.9786 - acc: 0.50 - ETA: 11s - loss: 8.0591 - acc: 0.50 - ETA: 11s - loss: 8.1954 - acc: 0.49 - ETA: 11s - loss: 8.1110 - acc: 0.49 - ETA: 11s - loss: 7.9231 - acc: 0.50 - ETA: 11s - loss: 7.9741 - acc: 0.50 - ETA: 10s - loss: 8.0152 - acc: 0.50 - ETA: 10s - loss: 8.0192 - acc: 0.50 - ETA: 10s - loss: 7.9419 - acc: 0.50 - ETA: 10s - loss: 7.9648 - acc: 0.50 - ETA: 10s - loss: 7.9485 - acc: 0.50 - ETA: 9s - loss: 7.8806 - acc: 0.5093 - ETA: 9s - loss: 7.8428 - acc: 0.511 - ETA: 9s - loss: 7.8060 - acc: 0.513 - ETA: 9s - loss: 7.7305 - acc: 0.518 - ETA: 9s - loss: 7.7648 - acc: 0.516 - ETA: 9s - loss: 7.8037 - acc: 0.514 - ETA: 8s - loss: 7.8235 - acc: 0.512 - ETA: 8s - loss: 7.8635 - acc: 0.510 - 
ETA: 8s - loss: 7.8386 - acc: 0.511 - ETA: 8s - loss: 7.8628 - acc: 0.510 - ETA: 8s - loss: 7.8858 - acc: 0.508 - ETA: 7s - loss: 7.8553 - acc: 0.510 - ETA: 7s - loss: 7.8389 - acc: 0.511 - ETA: 7s - loss: 7.8295 - acc: 0.512 - ETA: 7s - loss: 7.7821 - acc: 0.514 - ETA: 7s - loss: 7.7700 - acc: 0.515 - ETA: 6s - loss: 7.7941 - acc: 0.514 - ETA: 6s - loss: 7.8274 - acc: 0.512 - ETA: 6s - loss: 7.8263 - acc: 0.512 - ETA: 6s - loss: 7.7762 - acc: 0.515 - ETA: 6s - loss: 7.7843 - acc: 0.514 - ETA: 5s - loss: 7.7875 - acc: 0.514 - ETA: 5s - loss: 7.7687 - acc: 0.515 - ETA: 5s - loss: 7.7814 - acc: 0.515 - ETA: 5s - loss: 7.7720 - acc: 0.515 - ETA: 5s - loss: 7.7760 - acc: 0.515 - ETA: 4s - loss: 7.7752 - acc: 0.515 - ETA: 4s - loss: 7.8127 - acc: 0.513 - ETA: 4s - loss: 7.8222 - acc: 0.512 - ETA: 4s - loss: 7.8058 - acc: 0.513 - ETA: 4s - loss: 7.7935 - acc: 0.514 - ETA: 4s - loss: 7.7818 - acc: 0.515 - ETA: 3s - loss: 7.8220 - acc: 0.512 - ETA: 3s - loss: 7.8189 - acc: 0.512 - ETA: 3s - loss: 7.8082 - acc: 0.513 - ETA: 3s - loss: 7.8424 - acc: 0.511 - ETA: 3s - loss: 7.8247 - acc: 0.512 - ETA: 2s - loss: 7.8354 - acc: 0.511 - ETA: 2s - loss: 7.8183 - acc: 0.512 - ETA: 2s - loss: 7.8467 - acc: 0.510 - ETA: 2s - loss: 7.8300 - acc: 0.512 - ETA: 2s - loss: 7.8341 - acc: 0.511 - ETA: 1s - loss: 7.8380 - acc: 0.511 - ETA: 1s - loss: 7.8141 - acc: 0.513 - ETA: 1s - loss: 7.8237 - acc: 0.512 - ETA: 1s - loss: 7.8062 - acc: 0.513 - ETA: 1s - loss: 7.7918 - acc: 0.514 - ETA: 0s - loss: 7.7935 - acc: 0.514 - ETA: 0s - loss: 7.7849 - acc: 0.515 - ETA: 0s - loss: 7.7741 - acc: 0.515 - ETA: 0s - loss: 7.7785 - acc: 0.515 - ETA: 0s - loss: 7.7607 - acc: 0.516 - 13s 2ms/step - loss: 7.7812 - acc: 0.5154 - val_loss: 8.4437 - val_acc: 0.4707 Epoch 00017: val_loss improved from 8.46532 to 8.44373, saving model to saved_models/weights.best.Xception.hdf5 Epoch 18/20 6680/6680 [==============================] - ETA: 12s - loss: 6.9308 - acc: 0.57 - ETA: 12s - loss: 7.0920 - acc: 0.56 - ETA: 12s - loss: 7.2531 - acc: 0.55 - ETA: 12s - loss: 7.3337 - acc: 0.54 - ETA: 11s - loss: 7.4788 - acc: 0.53 - ETA: 11s - loss: 7.3757 - acc: 0.54 - ETA: 11s - loss: 7.5654 - acc: 0.53 - ETA: 11s - loss: 7.7137 - acc: 0.52 - ETA: 11s - loss: 7.7341 - acc: 0.51 - ETA: 10s - loss: 7.5249 - acc: 0.53 - ETA: 10s - loss: 7.6027 - acc: 0.52 - ETA: 10s - loss: 7.5524 - acc: 0.53 - ETA: 10s - loss: 7.6038 - acc: 0.52 - ETA: 10s - loss: 7.6018 - acc: 0.52 - ETA: 9s - loss: 7.6538 - acc: 0.5240 - ETA: 9s - loss: 7.6791 - acc: 0.522 - ETA: 9s - loss: 7.7489 - acc: 0.518 - ETA: 9s - loss: 7.7840 - acc: 0.516 - ETA: 9s - loss: 7.8409 - acc: 0.512 - ETA: 8s - loss: 7.7874 - acc: 0.516 - ETA: 8s - loss: 7.8080 - acc: 0.514 - ETA: 8s - loss: 7.7828 - acc: 0.516 - ETA: 8s - loss: 7.8298 - acc: 0.513 - ETA: 8s - loss: 7.8662 - acc: 0.511 - ETA: 8s - loss: 7.8095 - acc: 0.514 - ETA: 7s - loss: 7.8315 - acc: 0.513 - ETA: 7s - loss: 7.8459 - acc: 0.512 - ETA: 7s - loss: 7.8133 - acc: 0.514 - ETA: 7s - loss: 7.8773 - acc: 0.510 - ETA: 7s - loss: 7.8672 - acc: 0.511 - ETA: 6s - loss: 7.8712 - acc: 0.511 - ETA: 6s - loss: 7.8489 - acc: 0.512 - ETA: 6s - loss: 7.8553 - acc: 0.511 - ETA: 6s - loss: 7.8518 - acc: 0.512 - ETA: 6s - loss: 7.8528 - acc: 0.512 - ETA: 5s - loss: 7.8830 - acc: 0.510 - ETA: 5s - loss: 7.8747 - acc: 0.510 - ETA: 5s - loss: 7.8626 - acc: 0.511 - ETA: 5s - loss: 7.8263 - acc: 0.513 - ETA: 5s - loss: 7.8079 - acc: 0.514 - ETA: 4s - loss: 7.8022 - acc: 0.515 - ETA: 4s - loss: 7.7815 - acc: 0.516 - ETA: 4s - loss: 7.7692 - acc: 0.517 - 
ETA: 4s - loss: 7.7685 - acc: 0.517 - ETA: 4s - loss: 7.7678 - acc: 0.517 - ETA: 3s - loss: 7.7951 - acc: 0.515 - ETA: 3s - loss: 7.7939 - acc: 0.515 - ETA: 3s - loss: 7.8004 - acc: 0.515 - ETA: 3s - loss: 7.8057 - acc: 0.514 - ETA: 3s - loss: 7.8043 - acc: 0.515 - ETA: 3s - loss: 7.8015 - acc: 0.515 - ETA: 2s - loss: 7.7865 - acc: 0.516 - ETA: 2s - loss: 7.7675 - acc: 0.517 - ETA: 2s - loss: 7.7579 - acc: 0.517 - ETA: 2s - loss: 7.7494 - acc: 0.518 - ETA: 2s - loss: 7.7550 - acc: 0.517 - ETA: 1s - loss: 7.7631 - acc: 0.517 - ETA: 1s - loss: 7.7655 - acc: 0.517 - ETA: 1s - loss: 7.7431 - acc: 0.518 - ETA: 1s - loss: 7.7323 - acc: 0.519 - ETA: 1s - loss: 7.7464 - acc: 0.518 - ETA: 0s - loss: 7.7385 - acc: 0.518 - ETA: 0s - loss: 7.7410 - acc: 0.518 - ETA: 0s - loss: 7.7434 - acc: 0.518 - ETA: 0s - loss: 7.7533 - acc: 0.518 - ETA: 0s - loss: 7.7359 - acc: 0.519 - 13s 2ms/step - loss: 7.7639 - acc: 0.5174 - val_loss: 8.4139 - val_acc: 0.4707 Epoch 00018: val_loss improved from 8.44373 to 8.41393, saving model to saved_models/weights.best.Xception.hdf5 Epoch 19/20 6680/6680 [==============================] - ETA: 12s - loss: 7.0920 - acc: 0.56 - ETA: 12s - loss: 7.0114 - acc: 0.56 - ETA: 12s - loss: 7.1457 - acc: 0.55 - ETA: 11s - loss: 7.5352 - acc: 0.53 - ETA: 11s - loss: 7.7044 - acc: 0.52 - ETA: 11s - loss: 7.7367 - acc: 0.52 - ETA: 11s - loss: 7.8518 - acc: 0.51 - ETA: 11s - loss: 7.6966 - acc: 0.52 - ETA: 11s - loss: 7.6294 - acc: 0.52 - ETA: 10s - loss: 7.6157 - acc: 0.52 - ETA: 10s - loss: 7.7146 - acc: 0.52 - ETA: 10s - loss: 7.7165 - acc: 0.52 - ETA: 10s - loss: 7.7180 - acc: 0.52 - ETA: 10s - loss: 7.7424 - acc: 0.51 - ETA: 9s - loss: 7.7098 - acc: 0.5213 - ETA: 9s - loss: 7.7820 - acc: 0.516 - ETA: 9s - loss: 7.7414 - acc: 0.519 - ETA: 9s - loss: 7.8038 - acc: 0.515 - ETA: 9s - loss: 7.7663 - acc: 0.517 - ETA: 8s - loss: 7.7568 - acc: 0.518 - ETA: 8s - loss: 7.7558 - acc: 0.518 - ETA: 8s - loss: 7.7873 - acc: 0.516 - ETA: 8s - loss: 7.7956 - acc: 0.515 - ETA: 8s - loss: 7.8133 - acc: 0.514 - ETA: 7s - loss: 7.7780 - acc: 0.516 - ETA: 7s - loss: 7.8074 - acc: 0.515 - ETA: 7s - loss: 7.8108 - acc: 0.514 - ETA: 7s - loss: 7.7851 - acc: 0.516 - ETA: 7s - loss: 7.8501 - acc: 0.512 - ETA: 7s - loss: 7.8219 - acc: 0.513 - ETA: 6s - loss: 7.8399 - acc: 0.512 - ETA: 6s - loss: 7.8216 - acc: 0.513 - ETA: 6s - loss: 7.8093 - acc: 0.514 - ETA: 6s - loss: 7.7882 - acc: 0.515 - ETA: 6s - loss: 7.8356 - acc: 0.512 - ETA: 5s - loss: 7.8015 - acc: 0.515 - ETA: 5s - loss: 7.7954 - acc: 0.515 - ETA: 5s - loss: 7.7642 - acc: 0.517 - ETA: 5s - loss: 7.7717 - acc: 0.516 - ETA: 5s - loss: 7.7749 - acc: 0.516 - ETA: 4s - loss: 7.7818 - acc: 0.516 - ETA: 4s - loss: 7.7232 - acc: 0.520 - ETA: 4s - loss: 7.7122 - acc: 0.520 - ETA: 4s - loss: 7.7494 - acc: 0.518 - ETA: 4s - loss: 7.7420 - acc: 0.518 - ETA: 3s - loss: 7.7278 - acc: 0.519 - ETA: 3s - loss: 7.7246 - acc: 0.520 - ETA: 3s - loss: 7.7249 - acc: 0.520 - ETA: 3s - loss: 7.7350 - acc: 0.519 - ETA: 3s - loss: 7.7286 - acc: 0.519 - ETA: 3s - loss: 7.7319 - acc: 0.519 - ETA: 2s - loss: 7.7196 - acc: 0.520 - ETA: 2s - loss: 7.7321 - acc: 0.519 - ETA: 2s - loss: 7.7351 - acc: 0.519 - ETA: 2s - loss: 7.7410 - acc: 0.519 - ETA: 2s - loss: 7.7553 - acc: 0.518 - ETA: 1s - loss: 7.7691 - acc: 0.517 - ETA: 1s - loss: 7.7658 - acc: 0.517 - ETA: 1s - loss: 7.7598 - acc: 0.518 - ETA: 1s - loss: 7.7514 - acc: 0.518 - ETA: 1s - loss: 7.7274 - acc: 0.520 - ETA: 0s - loss: 7.7379 - acc: 0.519 - ETA: 0s - loss: 7.7405 - acc: 0.519 - ETA: 0s - loss: 7.7354 - acc: 0.519 - 
ETA: 0s - loss: 7.7403 - acc: 0.519 - ETA: 0s - loss: 7.7452 - acc: 0.518 - 13s 2ms/step - loss: 7.7538 - acc: 0.5184 - val_loss: 8.3968 - val_acc: 0.4707 Epoch 00019: val_loss improved from 8.41393 to 8.39676, saving model to saved_models/weights.best.Xception.hdf5 Epoch 20/20 6680/6680 [==============================] - ETA: 12s - loss: 9.3485 - acc: 0.42 - ETA: 12s - loss: 8.3008 - acc: 0.48 - ETA: 12s - loss: 8.1128 - acc: 0.49 - ETA: 11s - loss: 7.7770 - acc: 0.51 - ETA: 11s - loss: 7.7689 - acc: 0.51 - ETA: 11s - loss: 7.6830 - acc: 0.52 - ETA: 11s - loss: 7.6446 - acc: 0.52 - ETA: 11s - loss: 7.6964 - acc: 0.52 - ETA: 10s - loss: 7.6292 - acc: 0.52 - ETA: 10s - loss: 7.6077 - acc: 0.52 - ETA: 10s - loss: 7.5902 - acc: 0.52 - ETA: 10s - loss: 7.7098 - acc: 0.52 - ETA: 10s - loss: 7.8235 - acc: 0.51 - ETA: 9s - loss: 7.7943 - acc: 0.5164 - ETA: 9s - loss: 7.7582 - acc: 0.518 - ETA: 9s - loss: 7.7065 - acc: 0.521 - ETA: 9s - loss: 7.7367 - acc: 0.520 - ETA: 9s - loss: 7.6919 - acc: 0.522 - ETA: 9s - loss: 7.6434 - acc: 0.525 - ETA: 8s - loss: 7.6642 - acc: 0.524 - ETA: 8s - loss: 7.6523 - acc: 0.525 - ETA: 8s - loss: 7.6488 - acc: 0.525 - ETA: 8s - loss: 7.6553 - acc: 0.524 - ETA: 8s - loss: 7.6453 - acc: 0.525 - ETA: 7s - loss: 7.6232 - acc: 0.526 - ETA: 7s - loss: 7.6151 - acc: 0.527 - ETA: 7s - loss: 7.6196 - acc: 0.527 - ETA: 7s - loss: 7.6411 - acc: 0.525 - ETA: 7s - loss: 7.6110 - acc: 0.527 - ETA: 7s - loss: 7.6154 - acc: 0.527 - ETA: 6s - loss: 7.6141 - acc: 0.527 - ETA: 6s - loss: 7.6078 - acc: 0.527 - ETA: 6s - loss: 7.6410 - acc: 0.525 - ETA: 6s - loss: 7.6658 - acc: 0.524 - ETA: 6s - loss: 7.6724 - acc: 0.523 - ETA: 5s - loss: 7.6429 - acc: 0.525 - ETA: 5s - loss: 7.6019 - acc: 0.528 - ETA: 5s - loss: 7.6103 - acc: 0.527 - ETA: 5s - loss: 7.5970 - acc: 0.528 - ETA: 5s - loss: 7.6126 - acc: 0.527 - ETA: 4s - loss: 7.6549 - acc: 0.524 - ETA: 4s - loss: 7.6876 - acc: 0.522 - ETA: 4s - loss: 7.6662 - acc: 0.524 - ETA: 4s - loss: 7.6678 - acc: 0.523 - ETA: 4s - loss: 7.6920 - acc: 0.522 - ETA: 3s - loss: 7.7104 - acc: 0.521 - ETA: 3s - loss: 7.7113 - acc: 0.520 - ETA: 3s - loss: 7.7252 - acc: 0.520 - ETA: 3s - loss: 7.7057 - acc: 0.521 - ETA: 3s - loss: 7.6999 - acc: 0.521 - ETA: 3s - loss: 7.6975 - acc: 0.521 - ETA: 2s - loss: 7.7144 - acc: 0.520 - ETA: 2s - loss: 7.7148 - acc: 0.520 - ETA: 2s - loss: 7.6943 - acc: 0.521 - ETA: 2s - loss: 7.7009 - acc: 0.521 - ETA: 2s - loss: 7.7189 - acc: 0.520 - ETA: 1s - loss: 7.7135 - acc: 0.520 - ETA: 1s - loss: 7.7222 - acc: 0.520 - ETA: 1s - loss: 7.7088 - acc: 0.521 - ETA: 1s - loss: 7.7120 - acc: 0.520 - ETA: 1s - loss: 7.7124 - acc: 0.520 - ETA: 0s - loss: 7.7180 - acc: 0.520 - ETA: 0s - loss: 7.7346 - acc: 0.519 - ETA: 0s - loss: 7.7649 - acc: 0.517 - ETA: 0s - loss: 7.7595 - acc: 0.517 - ETA: 0s - loss: 7.7518 - acc: 0.518 - 13s 2ms/step - loss: 7.7434 - acc: 0.5189 - val_loss: 8.4149 - val_acc: 0.4707 Epoch 00020: val_loss did not improve we are at Xception_model1 Train on 6680 samples, validate on 835 samples Epoch 1/20 6680/6680 [==============================] - ETA: 45s - loss: 5.2801 - acc: 0.0000e+ - ETA: 24s - loss: 8.0695 - acc: 0.0350 - ETA: 17s - loss: 8.2098 - acc: 0.12 - ETA: 13s - loss: 8.2469 - acc: 0.17 - ETA: 11s - loss: 8.2045 - acc: 0.20 - ETA: 9s - loss: 8.2486 - acc: 0.2167 - ETA: 8s - loss: 8.1572 - acc: 0.240 - ETA: 8s - loss: 8.0634 - acc: 0.257 - ETA: 7s - loss: 7.8287 - acc: 0.278 - ETA: 6s - loss: 7.6610 - acc: 0.298 - ETA: 6s - loss: 7.5203 - acc: 0.308 - ETA: 6s - loss: 7.3759 - acc: 0.320 - ETA: 5s - 
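For reference, a fit call of the shape that produces a log like the one above pairs a ModelCheckpoint callback (so only the weights with the best validation loss are kept) with the precomputed bottleneck features. The cell below is a minimal sketch, not the notebook's exact code: the names Xception_model, train_Xception, and valid_Xception, as well as batch_size=20, are assumptions; only the checkpoint filepath and the 20-epoch schedule come from the log itself. Passing verbose=2 records one summary line per epoch instead of the per-batch progress bar, which keeps the exported notebook readable.

from keras.callbacks import ModelCheckpoint

# Hypothetical names: Xception_model and the bottleneck-feature arrays
# (train_Xception, valid_Xception) are assumed for illustration; only the
# filepath below is taken from the training log.
checkpointer = ModelCheckpoint(filepath='saved_models/weights.best.Xception.hdf5',
                               verbose=1, save_best_only=True)

Xception_model.fit(train_Xception, train_targets,
                   validation_data=(valid_Xception, valid_targets),
                   epochs=20, batch_size=20,  # batch_size is an assumption
                   callbacks=[checkpointer], verbose=2)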
we are at Xception_model1
Train on 6680 samples, validate on 835 samples
Epoch 1/20
6680/6680 [==============================] - 5s 727us/step - loss: 2.8593 - acc: 0.6253 - val_loss: 1.0026 - val_acc: 0.7856
Epoch 00001: val_loss improved from inf to 1.00256, saving model to saved_models/weights.best.Xception1.hdf5
Epoch 2/20
6680/6680 [==============================] - 4s 626us/step - loss: 0.5976 - acc: 0.8633 - val_loss: 1.1634 - val_acc: 0.7749
Epoch 00002: val_loss did not improve
Epoch 3/20
6680/6680 [==============================] - 4s 626us/step - loss: 0.3004 - acc: 0.9247 - val_loss: 1.1307 - val_acc: 0.7940
Epoch 00003: val_loss did not improve
Epoch 4/20
6680/6680 [==============================] - 4s 622us/step - loss: 0.1854 - acc: 0.9481 - val_loss: 1.2503 - val_acc: 0.8108
Epoch 00004: val_loss did not improve
Epoch 5/20
6680/6680 [==============================] - 4s 630us/step - loss: 0.1177 - acc: 0.9701 - val_loss: 1.0441 - val_acc: 0.8347
Epoch 00005: val_loss did not improve
Epoch 6/20
6680/6680 [==============================] - 4s 624us/step - loss: 0.0765 - acc: 0.9819 - val_loss: 1.0769 - val_acc: 0.8240
Epoch 00006: val_loss did not improve
Epoch 7/20
6680/6680 [==============================] - 4s 621us/step - loss: 0.0639 - acc: 0.9861 - val_loss: 1.2159 - val_acc: 0.8240
Epoch 00007: val_loss did not improve
Epoch 8/20
6680/6680 [==============================] - 4s 621us/step - loss: 0.0457 - acc: 0.9913 - val_loss: 1.1232 - val_acc: 0.8251
Epoch 00008: val_loss did not improve
Epoch 9/20
6680/6680 [==============================] - 4s 625us/step - loss: 0.0435 - acc: 0.9910 - val_loss: 1.0962 - val_acc: 0.8395
Epoch 00009: val_loss did not improve
Epoch 10/20
6680/6680 [==============================] - 4s 622us/step - loss: 0.0396 - acc: 0.9918 - val_loss: 1.0096 - val_acc: 0.8443
Epoch 00010: val_loss did not improve
Epoch 11/20
6680/6680 [==============================] - 4s 621us/step - loss: 0.0422 - acc: 0.9931 - val_loss: 1.0979 - val_acc: 0.8431
Epoch 00011: val_loss did not improve
Epoch 12/20
6680/6680 [==============================] - 4s 626us/step - loss: 0.0298 - acc: 0.9952 - val_loss: 1.2179 - val_acc: 0.8311
Epoch 00012: val_loss did not improve
Epoch 13/20
6680/6680 [==============================] - 4s 623us/step - loss: 0.0315 - acc: 0.9955 - val_loss: 1.2285 - val_acc: 0.8491
Epoch 00013: val_loss did not improve
Epoch 14/20
- loss: 0.0364 - acc: 0.994 - ETA: 2s - loss: 0.0352 - acc: 0.994 - ETA: 2s - loss: 0.0373 - acc: 0.994 - ETA: 2s - loss: 0.0364 - acc: 0.994 - ETA: 2s - loss: 0.0371 - acc: 0.994 - ETA: 1s - loss: 0.0361 - acc: 0.994 - ETA: 1s - loss: 0.0350 - acc: 0.994 - ETA: 1s - loss: 0.0340 - acc: 0.994 - ETA: 1s - loss: 0.0334 - acc: 0.994 - ETA: 1s - loss: 0.0325 - acc: 0.994 - ETA: 1s - loss: 0.0316 - acc: 0.995 - ETA: 1s - loss: 0.0308 - acc: 0.995 - ETA: 1s - loss: 0.0300 - acc: 0.995 - ETA: 1s - loss: 0.0294 - acc: 0.995 - ETA: 1s - loss: 0.0293 - acc: 0.995 - ETA: 1s - loss: 0.0288 - acc: 0.995 - ETA: 1s - loss: 0.0309 - acc: 0.995 - ETA: 1s - loss: 0.0302 - acc: 0.995 - ETA: 1s - loss: 0.0296 - acc: 0.995 - ETA: 1s - loss: 0.0290 - acc: 0.995 - ETA: 1s - loss: 0.0285 - acc: 0.995 - ETA: 1s - loss: 0.0279 - acc: 0.995 - ETA: 0s - loss: 0.0274 - acc: 0.995 - ETA: 0s - loss: 0.0269 - acc: 0.995 - ETA: 0s - loss: 0.0293 - acc: 0.995 - ETA: 0s - loss: 0.0289 - acc: 0.995 - ETA: 0s - loss: 0.0294 - acc: 0.995 - ETA: 0s - loss: 0.0318 - acc: 0.995 - ETA: 0s - loss: 0.0312 - acc: 0.995 - ETA: 0s - loss: 0.0311 - acc: 0.995 - ETA: 0s - loss: 0.0312 - acc: 0.995 - ETA: 0s - loss: 0.0308 - acc: 0.995 - ETA: 0s - loss: 0.0303 - acc: 0.995 - ETA: 0s - loss: 0.0326 - acc: 0.995 - ETA: 0s - loss: 0.0321 - acc: 0.995 - ETA: 0s - loss: 0.0316 - acc: 0.995 - ETA: 0s - loss: 0.0317 - acc: 0.995 - ETA: 0s - loss: 0.0312 - acc: 0.995 - ETA: 0s - loss: 0.0308 - acc: 0.995 - 4s 622us/step - loss: 0.0313 - acc: 0.9955 - val_loss: 1.2805 - val_acc: 0.8395 Epoch 00014: val_loss did not improve Epoch 15/20 6680/6680 [==============================] - ETA: 3s - loss: 0.0638 - acc: 0.990 - ETA: 3s - loss: 0.0390 - acc: 0.990 - ETA: 3s - loss: 0.0291 - acc: 0.990 - ETA: 3s - loss: 0.0445 - acc: 0.990 - ETA: 3s - loss: 0.0356 - acc: 0.992 - ETA: 3s - loss: 0.0297 - acc: 0.993 - ETA: 3s - loss: 0.0332 - acc: 0.992 - ETA: 3s - loss: 0.0292 - acc: 0.993 - ETA: 3s - loss: 0.0260 - acc: 0.994 - ETA: 3s - loss: 0.0234 - acc: 0.995 - ETA: 3s - loss: 0.0213 - acc: 0.995 - ETA: 3s - loss: 0.0195 - acc: 0.995 - ETA: 3s - loss: 0.0276 - acc: 0.994 - ETA: 3s - loss: 0.0257 - acc: 0.995 - ETA: 3s - loss: 0.0349 - acc: 0.994 - ETA: 2s - loss: 0.0433 - acc: 0.994 - ETA: 2s - loss: 0.0409 - acc: 0.994 - ETA: 2s - loss: 0.0387 - acc: 0.995 - ETA: 2s - loss: 0.0366 - acc: 0.995 - ETA: 2s - loss: 0.0405 - acc: 0.995 - ETA: 2s - loss: 0.0386 - acc: 0.995 - ETA: 2s - loss: 0.0369 - acc: 0.995 - ETA: 2s - loss: 0.0354 - acc: 0.995 - ETA: 2s - loss: 0.0339 - acc: 0.995 - ETA: 2s - loss: 0.0355 - acc: 0.995 - ETA: 2s - loss: 0.0346 - acc: 0.995 - ETA: 2s - loss: 0.0333 - acc: 0.995 - ETA: 2s - loss: 0.0321 - acc: 0.995 - ETA: 2s - loss: 0.0311 - acc: 0.995 - ETA: 2s - loss: 0.0302 - acc: 0.995 - ETA: 2s - loss: 0.0316 - acc: 0.995 - ETA: 2s - loss: 0.0306 - acc: 0.995 - ETA: 1s - loss: 0.0297 - acc: 0.995 - ETA: 1s - loss: 0.0288 - acc: 0.995 - ETA: 1s - loss: 0.0280 - acc: 0.996 - ETA: 1s - loss: 0.0273 - acc: 0.996 - ETA: 1s - loss: 0.0275 - acc: 0.995 - ETA: 1s - loss: 0.0268 - acc: 0.996 - ETA: 1s - loss: 0.0261 - acc: 0.996 - ETA: 1s - loss: 0.0259 - acc: 0.996 - ETA: 1s - loss: 0.0253 - acc: 0.996 - ETA: 1s - loss: 0.0247 - acc: 0.996 - ETA: 1s - loss: 0.0242 - acc: 0.996 - ETA: 1s - loss: 0.0259 - acc: 0.995 - ETA: 1s - loss: 0.0265 - acc: 0.995 - ETA: 1s - loss: 0.0308 - acc: 0.995 - ETA: 1s - loss: 0.0301 - acc: 0.995 - ETA: 1s - loss: 0.0295 - acc: 0.995 - ETA: 1s - loss: 0.0289 - acc: 0.995 - ETA: 0s - loss: 0.0283 - acc: 0.995 - ETA: 
0s - loss: 0.0278 - acc: 0.995 - ETA: 0s - loss: 0.0312 - acc: 0.995 - ETA: 0s - loss: 0.0307 - acc: 0.995 - ETA: 0s - loss: 0.0301 - acc: 0.995 - ETA: 0s - loss: 0.0305 - acc: 0.995 - ETA: 0s - loss: 0.0300 - acc: 0.995 - ETA: 0s - loss: 0.0295 - acc: 0.995 - ETA: 0s - loss: 0.0299 - acc: 0.995 - ETA: 0s - loss: 0.0294 - acc: 0.995 - ETA: 0s - loss: 0.0289 - acc: 0.995 - ETA: 0s - loss: 0.0284 - acc: 0.995 - ETA: 0s - loss: 0.0294 - acc: 0.995 - ETA: 0s - loss: 0.0289 - acc: 0.995 - ETA: 0s - loss: 0.0286 - acc: 0.995 - ETA: 0s - loss: 0.0282 - acc: 0.995 - ETA: 0s - loss: 0.0302 - acc: 0.995 - 4s 626us/step - loss: 0.0301 - acc: 0.9954 - val_loss: 1.1560 - val_acc: 0.8539 Epoch 00015: val_loss did not improve Epoch 16/20 6680/6680 [==============================] - ETA: 3s - loss: 0.0015 - acc: 1.000 - ETA: 3s - loss: 0.0043 - acc: 0.995 - ETA: 3s - loss: 0.0031 - acc: 0.996 - ETA: 3s - loss: 0.0031 - acc: 0.997 - ETA: 3s - loss: 0.0045 - acc: 0.996 - ETA: 3s - loss: 0.0038 - acc: 0.996 - ETA: 3s - loss: 0.0033 - acc: 0.997 - ETA: 3s - loss: 0.0030 - acc: 0.997 - ETA: 3s - loss: 0.0149 - acc: 0.996 - ETA: 3s - loss: 0.0135 - acc: 0.997 - ETA: 3s - loss: 0.0197 - acc: 0.996 - ETA: 3s - loss: 0.0181 - acc: 0.996 - ETA: 3s - loss: 0.0230 - acc: 0.996 - ETA: 3s - loss: 0.0214 - acc: 0.996 - ETA: 3s - loss: 0.0200 - acc: 0.996 - ETA: 2s - loss: 0.0187 - acc: 0.996 - ETA: 2s - loss: 0.0176 - acc: 0.997 - ETA: 2s - loss: 0.0167 - acc: 0.997 - ETA: 2s - loss: 0.0158 - acc: 0.997 - ETA: 2s - loss: 0.0150 - acc: 0.997 - ETA: 2s - loss: 0.0143 - acc: 0.997 - ETA: 2s - loss: 0.0136 - acc: 0.997 - ETA: 2s - loss: 0.0131 - acc: 0.997 - ETA: 2s - loss: 0.0125 - acc: 0.997 - ETA: 2s - loss: 0.0121 - acc: 0.998 - ETA: 2s - loss: 0.0173 - acc: 0.997 - ETA: 2s - loss: 0.0167 - acc: 0.997 - ETA: 2s - loss: 0.0161 - acc: 0.997 - ETA: 2s - loss: 0.0156 - acc: 0.997 - ETA: 2s - loss: 0.0208 - acc: 0.997 - ETA: 2s - loss: 0.0201 - acc: 0.997 - ETA: 2s - loss: 0.0195 - acc: 0.997 - ETA: 1s - loss: 0.0190 - acc: 0.997 - ETA: 1s - loss: 0.0216 - acc: 0.997 - ETA: 1s - loss: 0.0218 - acc: 0.997 - ETA: 1s - loss: 0.0212 - acc: 0.997 - ETA: 1s - loss: 0.0206 - acc: 0.997 - ETA: 1s - loss: 0.0201 - acc: 0.997 - ETA: 1s - loss: 0.0196 - acc: 0.997 - ETA: 1s - loss: 0.0191 - acc: 0.997 - ETA: 1s - loss: 0.0189 - acc: 0.997 - ETA: 1s - loss: 0.0263 - acc: 0.996 - ETA: 1s - loss: 0.0257 - acc: 0.997 - ETA: 1s - loss: 0.0252 - acc: 0.997 - ETA: 1s - loss: 0.0246 - acc: 0.997 - ETA: 1s - loss: 0.0241 - acc: 0.997 - ETA: 1s - loss: 0.0236 - acc: 0.997 - ETA: 1s - loss: 0.0231 - acc: 0.997 - ETA: 1s - loss: 0.0263 - acc: 0.996 - ETA: 0s - loss: 0.0257 - acc: 0.997 - ETA: 0s - loss: 0.0252 - acc: 0.997 - ETA: 0s - loss: 0.0248 - acc: 0.997 - ETA: 0s - loss: 0.0243 - acc: 0.997 - ETA: 0s - loss: 0.0238 - acc: 0.997 - ETA: 0s - loss: 0.0234 - acc: 0.997 - ETA: 0s - loss: 0.0230 - acc: 0.997 - ETA: 0s - loss: 0.0234 - acc: 0.997 - ETA: 0s - loss: 0.0253 - acc: 0.996 - ETA: 0s - loss: 0.0249 - acc: 0.996 - ETA: 0s - loss: 0.0259 - acc: 0.996 - ETA: 0s - loss: 0.0255 - acc: 0.996 - ETA: 0s - loss: 0.0251 - acc: 0.996 - ETA: 0s - loss: 0.0252 - acc: 0.996 - ETA: 0s - loss: 0.0248 - acc: 0.996 - ETA: 0s - loss: 0.0244 - acc: 0.996 - ETA: 0s - loss: 0.0242 - acc: 0.996 - 4s 623us/step - loss: 0.0239 - acc: 0.9967 - val_loss: 1.2130 - val_acc: 0.8479 Epoch 00016: val_loss did not improve Epoch 17/20 6680/6680 [==============================] - ETA: 3s - loss: 5.0557e-05 - acc: 1.000 - ETA: 3s - loss: 4.2567e-04 - acc: 1.000 - ETA: 3s - 
loss: 0.0541 - acc: 0.9967 - ETA: 3s - loss: 0.0406 - acc: 0.997 - ETA: 3s - loss: 0.0389 - acc: 0.996 - ETA: 3s - loss: 0.0356 - acc: 0.995 - ETA: 3s - loss: 0.0536 - acc: 0.994 - ETA: 3s - loss: 0.0485 - acc: 0.993 - ETA: 3s - loss: 0.0437 - acc: 0.994 - ETA: 3s - loss: 0.0393 - acc: 0.995 - ETA: 3s - loss: 0.0358 - acc: 0.995 - ETA: 3s - loss: 0.0328 - acc: 0.995 - ETA: 3s - loss: 0.0304 - acc: 0.996 - ETA: 3s - loss: 0.0282 - acc: 0.996 - ETA: 3s - loss: 0.0265 - acc: 0.996 - ETA: 2s - loss: 0.0249 - acc: 0.996 - ETA: 2s - loss: 0.0234 - acc: 0.997 - ETA: 2s - loss: 0.0238 - acc: 0.996 - ETA: 2s - loss: 0.0225 - acc: 0.996 - ETA: 2s - loss: 0.0214 - acc: 0.997 - ETA: 2s - loss: 0.0204 - acc: 0.997 - ETA: 2s - loss: 0.0195 - acc: 0.997 - ETA: 2s - loss: 0.0186 - acc: 0.997 - ETA: 2s - loss: 0.0179 - acc: 0.997 - ETA: 2s - loss: 0.0172 - acc: 0.997 - ETA: 2s - loss: 0.0227 - acc: 0.997 - ETA: 2s - loss: 0.0219 - acc: 0.997 - ETA: 2s - loss: 0.0211 - acc: 0.997 - ETA: 2s - loss: 0.0215 - acc: 0.997 - ETA: 2s - loss: 0.0208 - acc: 0.997 - ETA: 2s - loss: 0.0202 - acc: 0.997 - ETA: 2s - loss: 0.0246 - acc: 0.997 - ETA: 1s - loss: 0.0239 - acc: 0.997 - ETA: 1s - loss: 0.0232 - acc: 0.997 - ETA: 1s - loss: 0.0225 - acc: 0.997 - ETA: 1s - loss: 0.0219 - acc: 0.997 - ETA: 1s - loss: 0.0213 - acc: 0.997 - ETA: 1s - loss: 0.0207 - acc: 0.997 - ETA: 1s - loss: 0.0215 - acc: 0.997 - ETA: 1s - loss: 0.0211 - acc: 0.997 - ETA: 1s - loss: 0.0209 - acc: 0.997 - ETA: 1s - loss: 0.0209 - acc: 0.997 - ETA: 1s - loss: 0.0204 - acc: 0.997 - ETA: 1s - loss: 0.0207 - acc: 0.997 - ETA: 1s - loss: 0.0203 - acc: 0.997 - ETA: 1s - loss: 0.0198 - acc: 0.997 - ETA: 1s - loss: 0.0194 - acc: 0.997 - ETA: 1s - loss: 0.0202 - acc: 0.997 - ETA: 1s - loss: 0.0228 - acc: 0.996 - ETA: 0s - loss: 0.0262 - acc: 0.996 - ETA: 0s - loss: 0.0257 - acc: 0.996 - ETA: 0s - loss: 0.0252 - acc: 0.996 - ETA: 0s - loss: 0.0248 - acc: 0.996 - ETA: 0s - loss: 0.0243 - acc: 0.996 - ETA: 0s - loss: 0.0239 - acc: 0.996 - ETA: 0s - loss: 0.0235 - acc: 0.996 - ETA: 0s - loss: 0.0237 - acc: 0.996 - ETA: 0s - loss: 0.0233 - acc: 0.996 - ETA: 0s - loss: 0.0242 - acc: 0.996 - ETA: 0s - loss: 0.0241 - acc: 0.996 - ETA: 0s - loss: 0.0237 - acc: 0.996 - ETA: 0s - loss: 0.0233 - acc: 0.996 - ETA: 0s - loss: 0.0246 - acc: 0.996 - ETA: 0s - loss: 0.0242 - acc: 0.996 - ETA: 0s - loss: 0.0238 - acc: 0.996 - ETA: 0s - loss: 0.0238 - acc: 0.996 - 4s 635us/step - loss: 0.0254 - acc: 0.9964 - val_loss: 1.2485 - val_acc: 0.8371 Epoch 00017: val_loss did not improve Epoch 18/20 6680/6680 [==============================] - ETA: 3s - loss: 1.1371e-04 - acc: 1.000 - ETA: 3s - loss: 3.9015e-04 - acc: 1.000 - ETA: 3s - loss: 0.0701 - acc: 0.9933 - ETA: 3s - loss: 0.0526 - acc: 0.995 - ETA: 3s - loss: 0.0423 - acc: 0.996 - ETA: 3s - loss: 0.0352 - acc: 0.996 - ETA: 3s - loss: 0.0302 - acc: 0.997 - ETA: 3s - loss: 0.0264 - acc: 0.997 - ETA: 3s - loss: 0.0253 - acc: 0.996 - ETA: 3s - loss: 0.0228 - acc: 0.997 - ETA: 3s - loss: 0.0209 - acc: 0.997 - ETA: 3s - loss: 0.0192 - acc: 0.997 - ETA: 3s - loss: 0.0177 - acc: 0.997 - ETA: 3s - loss: 0.0165 - acc: 0.997 - ETA: 3s - loss: 0.0157 - acc: 0.998 - ETA: 2s - loss: 0.0168 - acc: 0.997 - ETA: 2s - loss: 0.0207 - acc: 0.996 - ETA: 2s - loss: 0.0196 - acc: 0.996 - ETA: 2s - loss: 0.0185 - acc: 0.996 - ETA: 2s - loss: 0.0176 - acc: 0.997 - ETA: 2s - loss: 0.0168 - acc: 0.997 - ETA: 2s - loss: 0.0160 - acc: 0.997 - ETA: 2s - loss: 0.0159 - acc: 0.997 - ETA: 2s - loss: 0.0153 - acc: 0.997 - ETA: 2s - loss: 0.0147 - acc: 0.997 
- ETA: 2s - loss: 0.0142 - acc: 0.997 - ETA: 2s - loss: 0.0250 - acc: 0.996 - ETA: 2s - loss: 0.0242 - acc: 0.996 - ETA: 2s - loss: 0.0233 - acc: 0.996 - ETA: 2s - loss: 0.0226 - acc: 0.996 - ETA: 2s - loss: 0.0219 - acc: 0.996 - ETA: 2s - loss: 0.0213 - acc: 0.996 - ETA: 1s - loss: 0.0206 - acc: 0.997 - ETA: 1s - loss: 0.0248 - acc: 0.996 - ETA: 1s - loss: 0.0241 - acc: 0.996 - ETA: 1s - loss: 0.0234 - acc: 0.996 - ETA: 1s - loss: 0.0228 - acc: 0.997 - ETA: 1s - loss: 0.0264 - acc: 0.996 - ETA: 1s - loss: 0.0299 - acc: 0.996 - ETA: 1s - loss: 0.0291 - acc: 0.996 - ETA: 1s - loss: 0.0293 - acc: 0.996 - ETA: 1s - loss: 0.0286 - acc: 0.996 - ETA: 1s - loss: 0.0280 - acc: 0.996 - ETA: 1s - loss: 0.0273 - acc: 0.996 - ETA: 1s - loss: 0.0268 - acc: 0.996 - ETA: 1s - loss: 0.0262 - acc: 0.997 - ETA: 1s - loss: 0.0258 - acc: 0.997 - ETA: 1s - loss: 0.0252 - acc: 0.997 - ETA: 1s - loss: 0.0247 - acc: 0.997 - ETA: 0s - loss: 0.0242 - acc: 0.997 - ETA: 0s - loss: 0.0277 - acc: 0.996 - ETA: 0s - loss: 0.0271 - acc: 0.996 - ETA: 0s - loss: 0.0266 - acc: 0.997 - ETA: 0s - loss: 0.0261 - acc: 0.997 - ETA: 0s - loss: 0.0276 - acc: 0.996 - ETA: 0s - loss: 0.0271 - acc: 0.996 - ETA: 0s - loss: 0.0266 - acc: 0.996 - ETA: 0s - loss: 0.0262 - acc: 0.996 - ETA: 0s - loss: 0.0266 - acc: 0.996 - ETA: 0s - loss: 0.0262 - acc: 0.996 - ETA: 0s - loss: 0.0257 - acc: 0.996 - ETA: 0s - loss: 0.0253 - acc: 0.996 - ETA: 0s - loss: 0.0250 - acc: 0.997 - ETA: 0s - loss: 0.0246 - acc: 0.997 - ETA: 0s - loss: 0.0243 - acc: 0.997 - ETA: 0s - loss: 0.0239 - acc: 0.997 - 4s 625us/step - loss: 0.0236 - acc: 0.9972 - val_loss: 1.1698 - val_acc: 0.8503 Epoch 00018: val_loss did not improve Epoch 19/20 6680/6680 [==============================] - ETA: 3s - loss: 0.0015 - acc: 1.000 - ETA: 3s - loss: 7.7382e-04 - acc: 1.000 - ETA: 3s - loss: 5.3915e-04 - acc: 1.000 - ETA: 3s - loss: 4.0773e-04 - acc: 1.000 - ETA: 3s - loss: 3.2649e-04 - acc: 1.000 - ETA: 3s - loss: 3.0896e-04 - acc: 1.000 - ETA: 3s - loss: 0.0233 - acc: 0.9986 - ETA: 3s - loss: 0.0204 - acc: 0.998 - ETA: 3s - loss: 0.0181 - acc: 0.998 - ETA: 3s - loss: 0.0324 - acc: 0.998 - ETA: 3s - loss: 0.0295 - acc: 0.998 - ETA: 3s - loss: 0.0270 - acc: 0.998 - ETA: 3s - loss: 0.0249 - acc: 0.998 - ETA: 3s - loss: 0.0238 - acc: 0.997 - ETA: 3s - loss: 0.0225 - acc: 0.998 - ETA: 2s - loss: 0.0211 - acc: 0.998 - ETA: 2s - loss: 0.0199 - acc: 0.998 - ETA: 2s - loss: 0.0277 - acc: 0.997 - ETA: 2s - loss: 0.0263 - acc: 0.997 - ETA: 2s - loss: 0.0250 - acc: 0.998 - ETA: 2s - loss: 0.0238 - acc: 0.998 - ETA: 2s - loss: 0.0241 - acc: 0.997 - ETA: 2s - loss: 0.0244 - acc: 0.997 - ETA: 2s - loss: 0.0234 - acc: 0.997 - ETA: 2s - loss: 0.0224 - acc: 0.997 - ETA: 2s - loss: 0.0216 - acc: 0.997 - ETA: 2s - loss: 0.0208 - acc: 0.997 - ETA: 2s - loss: 0.0200 - acc: 0.997 - ETA: 2s - loss: 0.0194 - acc: 0.997 - ETA: 2s - loss: 0.0188 - acc: 0.998 - ETA: 2s - loss: 0.0182 - acc: 0.998 - ETA: 2s - loss: 0.0176 - acc: 0.998 - ETA: 1s - loss: 0.0220 - acc: 0.997 - ETA: 1s - loss: 0.0230 - acc: 0.997 - ETA: 1s - loss: 0.0275 - acc: 0.996 - ETA: 1s - loss: 0.0267 - acc: 0.996 - ETA: 1s - loss: 0.0261 - acc: 0.997 - ETA: 1s - loss: 0.0278 - acc: 0.996 - ETA: 1s - loss: 0.0271 - acc: 0.996 - ETA: 1s - loss: 0.0264 - acc: 0.997 - ETA: 1s - loss: 0.0258 - acc: 0.997 - ETA: 1s - loss: 0.0252 - acc: 0.997 - ETA: 1s - loss: 0.0246 - acc: 0.997 - ETA: 1s - loss: 0.0258 - acc: 0.997 - ETA: 1s - loss: 0.0266 - acc: 0.996 - ETA: 1s - loss: 0.0261 - acc: 0.996 - ETA: 1s - loss: 0.0255 - acc: 0.996 - ETA: 1s - 
loss: 0.0250 - acc: 0.996 - ETA: 1s - loss: 0.0245 - acc: 0.996 - ETA: 0s - loss: 0.0240 - acc: 0.997 - ETA: 0s - loss: 0.0268 - acc: 0.996 - ETA: 0s - loss: 0.0263 - acc: 0.996 - ETA: 0s - loss: 0.0258 - acc: 0.997 - ETA: 0s - loss: 0.0253 - acc: 0.997 - ETA: 0s - loss: 0.0269 - acc: 0.996 - ETA: 0s - loss: 0.0264 - acc: 0.997 - ETA: 0s - loss: 0.0260 - acc: 0.997 - ETA: 0s - loss: 0.0255 - acc: 0.997 - ETA: 0s - loss: 0.0257 - acc: 0.996 - ETA: 0s - loss: 0.0253 - acc: 0.996 - ETA: 0s - loss: 0.0254 - acc: 0.996 - ETA: 0s - loss: 0.0250 - acc: 0.996 - ETA: 0s - loss: 0.0248 - acc: 0.996 - ETA: 0s - loss: 0.0244 - acc: 0.996 - ETA: 0s - loss: 0.0240 - acc: 0.996 - ETA: 0s - loss: 0.0243 - acc: 0.996 - 4s 621us/step - loss: 0.0240 - acc: 0.9967 - val_loss: 1.2629 - val_acc: 0.8527 Epoch 00019: val_loss did not improve Epoch 20/20 6680/6680 [==============================] - ETA: 3s - loss: 9.0022e-06 - acc: 1.000 - ETA: 3s - loss: 9.8213e-06 - acc: 1.000 - ETA: 3s - loss: 0.0537 - acc: 0.9967 - ETA: 3s - loss: 0.0403 - acc: 0.997 - ETA: 3s - loss: 0.0322 - acc: 0.998 - ETA: 3s - loss: 0.0538 - acc: 0.996 - ETA: 3s - loss: 0.0461 - acc: 0.997 - ETA: 3s - loss: 0.0406 - acc: 0.997 - ETA: 3s - loss: 0.0361 - acc: 0.997 - ETA: 3s - loss: 0.0325 - acc: 0.998 - ETA: 3s - loss: 0.0296 - acc: 0.998 - ETA: 3s - loss: 0.0272 - acc: 0.998 - ETA: 3s - loss: 0.0251 - acc: 0.998 - ETA: 3s - loss: 0.0233 - acc: 0.998 - ETA: 3s - loss: 0.0218 - acc: 0.998 - ETA: 2s - loss: 0.0204 - acc: 0.998 - ETA: 2s - loss: 0.0192 - acc: 0.998 - ETA: 2s - loss: 0.0182 - acc: 0.998 - ETA: 2s - loss: 0.0172 - acc: 0.998 - ETA: 2s - loss: 0.0244 - acc: 0.998 - ETA: 2s - loss: 0.0235 - acc: 0.998 - ETA: 2s - loss: 0.0224 - acc: 0.998 - ETA: 2s - loss: 0.0219 - acc: 0.998 - ETA: 2s - loss: 0.0210 - acc: 0.998 - ETA: 2s - loss: 0.0201 - acc: 0.998 - ETA: 2s - loss: 0.0194 - acc: 0.998 - ETA: 2s - loss: 0.0187 - acc: 0.998 - ETA: 2s - loss: 0.0180 - acc: 0.998 - ETA: 2s - loss: 0.0174 - acc: 0.998 - ETA: 2s - loss: 0.0169 - acc: 0.998 - ETA: 2s - loss: 0.0164 - acc: 0.998 - ETA: 2s - loss: 0.0159 - acc: 0.998 - ETA: 1s - loss: 0.0154 - acc: 0.998 - ETA: 1s - loss: 0.0149 - acc: 0.998 - ETA: 1s - loss: 0.0145 - acc: 0.998 - ETA: 1s - loss: 0.0141 - acc: 0.998 - ETA: 1s - loss: 0.0137 - acc: 0.998 - ETA: 1s - loss: 0.0176 - acc: 0.998 - ETA: 1s - loss: 0.0172 - acc: 0.998 - ETA: 1s - loss: 0.0167 - acc: 0.998 - ETA: 1s - loss: 0.0203 - acc: 0.998 - ETA: 1s - loss: 0.0198 - acc: 0.998 - ETA: 1s - loss: 0.0193 - acc: 0.998 - ETA: 1s - loss: 0.0189 - acc: 0.998 - ETA: 1s - loss: 0.0185 - acc: 0.998 - ETA: 1s - loss: 0.0188 - acc: 0.998 - ETA: 1s - loss: 0.0184 - acc: 0.998 - ETA: 1s - loss: 0.0184 - acc: 0.998 - ETA: 1s - loss: 0.0180 - acc: 0.998 - ETA: 0s - loss: 0.0176 - acc: 0.998 - ETA: 0s - loss: 0.0173 - acc: 0.998 - ETA: 0s - loss: 0.0170 - acc: 0.998 - ETA: 0s - loss: 0.0166 - acc: 0.998 - ETA: 0s - loss: 0.0163 - acc: 0.998 - ETA: 0s - loss: 0.0167 - acc: 0.998 - ETA: 0s - loss: 0.0188 - acc: 0.998 - ETA: 0s - loss: 0.0185 - acc: 0.998 - ETA: 0s - loss: 0.0181 - acc: 0.998 - ETA: 0s - loss: 0.0186 - acc: 0.998 - ETA: 0s - loss: 0.0183 - acc: 0.998 - ETA: 0s - loss: 0.0180 - acc: 0.998 - ETA: 0s - loss: 0.0177 - acc: 0.998 - ETA: 0s - loss: 0.0174 - acc: 0.998 - ETA: 0s - loss: 0.0172 - acc: 0.998 - ETA: 0s - loss: 0.0169 - acc: 0.998 - ETA: 0s - loss: 0.0191 - acc: 0.998 - 4s 623us/step - loss: 0.0203 - acc: 0.9981 - val_loss: 1.2912 - val_acc: 0.8455 Epoch 00020: val_loss did not improve we are at Xception_model2 Train 
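Two further variants, Xception_model2 and Xception_model3, are trained below with the same recipe. The 'Epoch 000NN: val_loss improved ..., saving model to ...' and 'val_loss did not improve' messages in these logs are printed by a ModelCheckpoint callback that monitors validation loss and keeps only the best weights seen so far. As a point of reference, here is a minimal sketch of the kind of training cell that produces output in this shape; it assumes the bottleneck-feature arrays train_Xception and valid_Xception and the one-hot target arrays from the earlier cells, and the batch size of 20 is a guess, while the checkpoint path is taken from the log below.

from keras.callbacks import ModelCheckpoint

# Save only the weights that achieve the lowest validation loss seen so far;
# this callback is what prints the "val_loss improved ... saving model to ..."
# and "val_loss did not improve" lines after each epoch.
checkpointer = ModelCheckpoint(filepath='saved_models/weights.best.Xception2.hdf5',
                               verbose=1, save_best_only=True)

Xception_model2.fit(train_Xception, train_targets,
                    validation_data=(valid_Xception, valid_targets),
                    epochs=20, batch_size=20, callbacks=[checkpointer], verbose=1)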
we are at Xception_model2
Train on 6680 samples, validate on 835 samples
Epoch 1/20 - 3s 450us/step - loss: 1.5950 - acc: 0.6897 - val_loss: 0.6722 - val_acc: 0.8228
Epoch 00001: val_loss improved from inf to 0.67221, saving model to saved_models/weights.best.Xception2.hdf5
Epoch 2/20 - 2s 348us/step - loss: 0.4871 - acc: 0.8641 - val_loss: 0.5128 - val_acc: 0.8431
Epoch 00002: val_loss improved from 0.67221 to 0.51285, saving model to saved_models/weights.best.Xception2.hdf5
Epoch 3/20 - 2s 342us/step - loss: 0.3433 - acc: 0.9000 - val_loss: 0.4520 - val_acc: 0.8671
Epoch 00003: val_loss improved from 0.51285 to 0.45197, saving model to saved_models/weights.best.Xception2.hdf5
Epoch 4/20 - 2s 343us/step - loss: 0.2684 - acc: 0.9186 - val_loss: 0.4608 - val_acc: 0.8443
Epoch 00004: val_loss did not improve
Epoch 5/20 - 2s 347us/step - loss: 0.2163 - acc: 0.9356 - val_loss: 0.4646 - val_acc: 0.8515
Epoch 00005: val_loss did not improve
Epoch 6/20 - 2s 347us/step - loss: 0.1801 - acc: 0.9487 - val_loss: 0.4403 - val_acc: 0.8563
Epoch 00006: val_loss improved from 0.45197 to 0.44031, saving model to saved_models/weights.best.Xception2.hdf5
Epoch 7/20 - 2s 355us/step - loss: 0.1514 - acc: 0.9597 - val_loss: 0.4436 - val_acc: 0.8563
Epoch 00007: val_loss did not improve
Epoch 8/20 - 2s 347us/step - loss: 0.1292 - acc: 0.9662 - val_loss: 0.4647 - val_acc: 0.8611
Epoch 00008: val_loss did not improve
Epoch 9/20 - 2s 348us/step - loss: 0.1106 - acc: 0.9708 - val_loss: 0.4453 - val_acc: 0.8623
Epoch 00009: val_loss did not improve
Epoch 10/20 - 2s 359us/step - loss: 0.0933 - acc: 0.9757 - val_loss: 0.4544 - val_acc: 0.8575
Epoch 00010: val_loss did not improve
Epoch 11/20 - 2s 366us/step - loss: 0.0816 - acc: 0.9802 - val_loss: 0.4706 - val_acc: 0.8467
Epoch 00011: val_loss did not improve
Epoch 12/20 - 2s 372us/step - loss: 0.0708 - acc: 0.9838 - val_loss: 0.4567 - val_acc: 0.8599
Epoch 00012: val_loss did not improve
Epoch 13/20 - 2s 351us/step - loss: 0.0598 - acc: 0.9879 - val_loss: 0.4820 - val_acc: 0.8575
Epoch 00013: val_loss did not improve
Epoch 14/20 - 2s 348us/step - loss: 0.0525 - acc: 0.9904 - val_loss: 0.4850 - val_acc: 0.8515
Epoch 00014: val_loss did not improve
Epoch 15/20 - 2s 359us/step - loss: 0.0473 - acc: 0.9894 - val_loss: 0.4893 - val_acc: 0.8635
Epoch 00015: val_loss did not improve
Epoch 16/20 - 3s 375us/step - loss: 0.0402 - acc: 0.9933 - val_loss: 0.5054 - val_acc: 0.8611
Epoch 00016: val_loss did not improve
Epoch 17/20 - 2s 374us/step - loss: 0.0354 - acc: 0.9933 - val_loss: 0.5075 - val_acc: 0.8551
Epoch 00017: val_loss did not improve
Epoch 18/20 - 2s 368us/step - loss: 0.0326 - acc: 0.9943 - val_loss: 0.5151 - val_acc: 0.8503
Epoch 00018: val_loss did not improve
Epoch 19/20 - 2s 367us/step - loss: 0.0291 - acc: 0.9946 - val_loss: 0.5188 - val_acc: 0.8563
Epoch 00019: val_loss did not improve
Epoch 20/20 - 2s 371us/step - loss: 0.0253 - acc: 0.9963 - val_loss: 0.5310 - val_acc: 0.8563
Epoch 00020: val_loss did not improve

we are at Xception_model3
Train on 6680 samples, validate on 835 samples
Epoch 1/20 - 3s 480us/step - loss: 1.6569 - acc: 0.6190 - val_loss: 0.7594 - val_acc: 0.7557
Epoch 00001: val_loss improved from inf to 0.75939, saving model to saved_models/weights.best.Xception3.hdf5
Epoch 2/20 - 3s 380us/step - loss: 0.5587 - acc: 0.8254 - val_loss: 0.7107 - val_acc: 0.7844
Epoch 00002: val_loss improved from 0.75939 to 0.71075, saving model to saved_models/weights.best.Xception3.hdf5
Epoch 3/20 - 2s 368us/step - loss: 0.3758 - acc: 0.8796 - val_loss: 0.6763 - val_acc: 0.7940
Epoch 00003: val_loss improved from 0.71075 to 0.67629, saving model to saved_models/weights.best.Xception3.hdf5
Epoch 4/20 - 2s 369us/step - loss: 0.2702 - acc: 0.9049 - val_loss: 0.5968 - val_acc: 0.8335
Epoch 00004: val_loss improved from 0.67629 to 0.59678, saving model to saved_models/weights.best.Xception3.hdf5
Epoch 5/20 - 2s 366us/step - loss: 0.1899 - acc: 0.9365 - val_loss: 0.6909 - val_acc: 0.8204
Epoch 00005: val_loss did not improve
Epoch 6/20 - 2s 374us/step - loss: 0.1494 - acc: 0.9493 - val_loss: 0.7097 - val_acc: 0.8048
Epoch 00006: val_loss did not improve
Epoch 7/20 - 2s 368us/step - loss: 0.1051 - acc: 0.9648 - val_loss: 0.7126 - val_acc: 0.8144
Epoch 00007: val_loss did not improve
Epoch 8/20 - 2s 366us/step - loss: 0.0854 - acc: 0.9734 - val_loss: 0.6275 - val_acc: 0.8359
Epoch 00008: val_loss did not improve
Epoch 9/20 - 2s 368us/step - loss: 0.0674 - acc: 0.9789 - val_loss: 0.6148 - val_acc: 0.8359
Epoch 00009: val_loss did not improve
Epoch 10/20
0.988 - ETA: 1s - loss: 0.0414 - acc: 0.989 - ETA: 1s - loss: 0.0425 - acc: 0.988 - ETA: 1s - loss: 0.0436 - acc: 0.988 - ETA: 1s - loss: 0.0448 - acc: 0.989 - ETA: 1s - loss: 0.0441 - acc: 0.988 - ETA: 1s - loss: 0.0435 - acc: 0.988 - ETA: 1s - loss: 0.0457 - acc: 0.988 - ETA: 0s - loss: 0.0457 - acc: 0.987 - ETA: 0s - loss: 0.0456 - acc: 0.987 - ETA: 0s - loss: 0.0474 - acc: 0.986 - ETA: 0s - loss: 0.0498 - acc: 0.985 - ETA: 0s - loss: 0.0508 - acc: 0.985 - ETA: 0s - loss: 0.0511 - acc: 0.984 - ETA: 0s - loss: 0.0516 - acc: 0.984 - ETA: 0s - loss: 0.0513 - acc: 0.984 - ETA: 0s - loss: 0.0515 - acc: 0.984 - ETA: 0s - loss: 0.0519 - acc: 0.984 - ETA: 0s - loss: 0.0519 - acc: 0.984 - ETA: 0s - loss: 0.0513 - acc: 0.984 - ETA: 0s - loss: 0.0507 - acc: 0.984 - ETA: 0s - loss: 0.0501 - acc: 0.985 - ETA: 0s - loss: 0.0490 - acc: 0.985 - 2s 371us/step - loss: 0.0487 - acc: 0.9858 - val_loss: 0.6878 - val_acc: 0.8395 Epoch 00010: val_loss did not improve Epoch 11/20 6680/6680 [==============================] - ETA: 2s - loss: 0.0448 - acc: 0.980 - ETA: 2s - loss: 0.0251 - acc: 0.993 - ETA: 2s - loss: 0.0237 - acc: 0.994 - ETA: 1s - loss: 0.0255 - acc: 0.992 - ETA: 1s - loss: 0.0260 - acc: 0.993 - ETA: 1s - loss: 0.0303 - acc: 0.990 - ETA: 1s - loss: 0.0280 - acc: 0.992 - ETA: 1s - loss: 0.0272 - acc: 0.992 - ETA: 1s - loss: 0.0283 - acc: 0.992 - ETA: 1s - loss: 0.0293 - acc: 0.991 - ETA: 1s - loss: 0.0294 - acc: 0.991 - ETA: 1s - loss: 0.0284 - acc: 0.991 - ETA: 1s - loss: 0.0302 - acc: 0.991 - ETA: 1s - loss: 0.0345 - acc: 0.989 - ETA: 1s - loss: 0.0386 - acc: 0.989 - ETA: 1s - loss: 0.0382 - acc: 0.989 - ETA: 1s - loss: 0.0395 - acc: 0.989 - ETA: 1s - loss: 0.0406 - acc: 0.990 - ETA: 0s - loss: 0.0402 - acc: 0.990 - ETA: 0s - loss: 0.0388 - acc: 0.990 - ETA: 0s - loss: 0.0378 - acc: 0.991 - ETA: 0s - loss: 0.0374 - acc: 0.990 - ETA: 0s - loss: 0.0379 - acc: 0.990 - ETA: 0s - loss: 0.0394 - acc: 0.989 - ETA: 0s - loss: 0.0400 - acc: 0.989 - ETA: 0s - loss: 0.0396 - acc: 0.989 - ETA: 0s - loss: 0.0403 - acc: 0.989 - ETA: 0s - loss: 0.0410 - acc: 0.989 - ETA: 0s - loss: 0.0406 - acc: 0.989 - ETA: 0s - loss: 0.0411 - acc: 0.989 - ETA: 0s - loss: 0.0420 - acc: 0.988 - ETA: 0s - loss: 0.0421 - acc: 0.988 - ETA: 0s - loss: 0.0426 - acc: 0.988 - 2s 370us/step - loss: 0.0448 - acc: 0.9882 - val_loss: 0.7090 - val_acc: 0.8299 Epoch 00011: val_loss did not improve Epoch 12/20 6680/6680 [==============================] - ETA: 2s - loss: 0.0240 - acc: 0.990 - ETA: 2s - loss: 0.0183 - acc: 0.993 - ETA: 2s - loss: 0.0186 - acc: 0.996 - ETA: 2s - loss: 0.0181 - acc: 0.997 - ETA: 2s - loss: 0.0187 - acc: 0.996 - ETA: 1s - loss: 0.0198 - acc: 0.995 - ETA: 1s - loss: 0.0202 - acc: 0.994 - ETA: 1s - loss: 0.0195 - acc: 0.995 - ETA: 1s - loss: 0.0218 - acc: 0.995 - ETA: 1s - loss: 0.0219 - acc: 0.994 - ETA: 1s - loss: 0.0239 - acc: 0.993 - ETA: 1s - loss: 0.0272 - acc: 0.992 - ETA: 1s - loss: 0.0275 - acc: 0.992 - ETA: 1s - loss: 0.0298 - acc: 0.991 - ETA: 1s - loss: 0.0287 - acc: 0.991 - ETA: 1s - loss: 0.0283 - acc: 0.991 - ETA: 1s - loss: 0.0284 - acc: 0.992 - ETA: 1s - loss: 0.0273 - acc: 0.992 - ETA: 1s - loss: 0.0288 - acc: 0.992 - ETA: 0s - loss: 0.0295 - acc: 0.992 - ETA: 0s - loss: 0.0287 - acc: 0.992 - ETA: 0s - loss: 0.0296 - acc: 0.991 - ETA: 0s - loss: 0.0315 - acc: 0.991 - ETA: 0s - loss: 0.0346 - acc: 0.990 - ETA: 0s - loss: 0.0355 - acc: 0.989 - ETA: 0s - loss: 0.0359 - acc: 0.989 - ETA: 0s - loss: 0.0350 - acc: 0.989 - ETA: 0s - loss: 0.0350 - acc: 0.989 - ETA: 0s - loss: 0.0350 - acc: 0.989 - ETA: 
0s - loss: 0.0344 - acc: 0.989 - ETA: 0s - loss: 0.0345 - acc: 0.989 - ETA: 0s - loss: 0.0355 - acc: 0.989 - ETA: 0s - loss: 0.0351 - acc: 0.989 - 3s 376us/step - loss: 0.0350 - acc: 0.9898 - val_loss: 0.6791 - val_acc: 0.8431 Epoch 00012: val_loss did not improve Epoch 13/20 6680/6680 [==============================] - ETA: 2s - loss: 0.0273 - acc: 0.980 - ETA: 2s - loss: 0.0233 - acc: 0.990 - ETA: 2s - loss: 0.0221 - acc: 0.992 - ETA: 2s - loss: 0.0257 - acc: 0.991 - ETA: 1s - loss: 0.0254 - acc: 0.991 - ETA: 1s - loss: 0.0224 - acc: 0.992 - ETA: 1s - loss: 0.0231 - acc: 0.992 - ETA: 1s - loss: 0.0215 - acc: 0.993 - ETA: 1s - loss: 0.0209 - acc: 0.994 - ETA: 1s - loss: 0.0202 - acc: 0.994 - ETA: 1s - loss: 0.0206 - acc: 0.994 - ETA: 1s - loss: 0.0209 - acc: 0.993 - ETA: 1s - loss: 0.0205 - acc: 0.994 - ETA: 1s - loss: 0.0212 - acc: 0.994 - ETA: 1s - loss: 0.0210 - acc: 0.994 - ETA: 1s - loss: 0.0206 - acc: 0.994 - ETA: 1s - loss: 0.0219 - acc: 0.994 - ETA: 1s - loss: 0.0213 - acc: 0.994 - ETA: 1s - loss: 0.0211 - acc: 0.994 - ETA: 0s - loss: 0.0211 - acc: 0.994 - ETA: 0s - loss: 0.0219 - acc: 0.993 - ETA: 0s - loss: 0.0249 - acc: 0.992 - ETA: 0s - loss: 0.0255 - acc: 0.992 - ETA: 0s - loss: 0.0265 - acc: 0.993 - ETA: 0s - loss: 0.0281 - acc: 0.992 - ETA: 0s - loss: 0.0278 - acc: 0.992 - ETA: 0s - loss: 0.0275 - acc: 0.992 - ETA: 0s - loss: 0.0284 - acc: 0.992 - ETA: 0s - loss: 0.0291 - acc: 0.991 - ETA: 0s - loss: 0.0286 - acc: 0.992 - ETA: 0s - loss: 0.0284 - acc: 0.992 - ETA: 0s - loss: 0.0284 - acc: 0.991 - ETA: 0s - loss: 0.0280 - acc: 0.992 - 2s 371us/step - loss: 0.0296 - acc: 0.9918 - val_loss: 0.7305 - val_acc: 0.8347 Epoch 00013: val_loss did not improve Epoch 14/20 6680/6680 [==============================] - ETA: 2s - loss: 0.0196 - acc: 0.990 - ETA: 2s - loss: 0.0210 - acc: 0.993 - ETA: 2s - loss: 0.0162 - acc: 0.996 - ETA: 2s - loss: 0.0130 - acc: 0.997 - ETA: 1s - loss: 0.0113 - acc: 0.997 - ETA: 1s - loss: 0.0112 - acc: 0.998 - ETA: 1s - loss: 0.0139 - acc: 0.996 - ETA: 1s - loss: 0.0140 - acc: 0.996 - ETA: 1s - loss: 0.0131 - acc: 0.997 - ETA: 1s - loss: 0.0124 - acc: 0.997 - ETA: 1s - loss: 0.0154 - acc: 0.996 - ETA: 1s - loss: 0.0172 - acc: 0.995 - ETA: 1s - loss: 0.0172 - acc: 0.994 - ETA: 1s - loss: 0.0170 - acc: 0.995 - ETA: 1s - loss: 0.0170 - acc: 0.994 - ETA: 1s - loss: 0.0170 - acc: 0.995 - ETA: 1s - loss: 0.0183 - acc: 0.994 - ETA: 1s - loss: 0.0204 - acc: 0.994 - ETA: 0s - loss: 0.0216 - acc: 0.993 - ETA: 0s - loss: 0.0213 - acc: 0.993 - ETA: 0s - loss: 0.0215 - acc: 0.993 - ETA: 0s - loss: 0.0223 - acc: 0.993 - ETA: 0s - loss: 0.0227 - acc: 0.992 - ETA: 0s - loss: 0.0221 - acc: 0.993 - ETA: 0s - loss: 0.0216 - acc: 0.993 - ETA: 0s - loss: 0.0213 - acc: 0.993 - ETA: 0s - loss: 0.0223 - acc: 0.993 - ETA: 0s - loss: 0.0222 - acc: 0.993 - ETA: 0s - loss: 0.0220 - acc: 0.993 - ETA: 0s - loss: 0.0218 - acc: 0.993 - ETA: 0s - loss: 0.0219 - acc: 0.993 - ETA: 0s - loss: 0.0231 - acc: 0.993 - ETA: 0s - loss: 0.0230 - acc: 0.993 - 2s 369us/step - loss: 0.0239 - acc: 0.9928 - val_loss: 0.7486 - val_acc: 0.8443 Epoch 00014: val_loss did not improve Epoch 15/20 6680/6680 [==============================] - ETA: 2s - loss: 0.0135 - acc: 1.000 - ETA: 2s - loss: 0.0154 - acc: 0.996 - ETA: 2s - loss: 0.0186 - acc: 0.996 - ETA: 1s - loss: 0.0220 - acc: 0.994 - ETA: 1s - loss: 0.0244 - acc: 0.992 - ETA: 1s - loss: 0.0249 - acc: 0.990 - ETA: 1s - loss: 0.0233 - acc: 0.991 - ETA: 1s - loss: 0.0232 - acc: 0.991 - ETA: 1s - loss: 0.0221 - acc: 0.992 - ETA: 1s - loss: 0.0218 - acc: 
0.992 - ETA: 1s - loss: 0.0206 - acc: 0.993 - ETA: 1s - loss: 0.0194 - acc: 0.993 - ETA: 1s - loss: 0.0199 - acc: 0.994 - ETA: 1s - loss: 0.0212 - acc: 0.993 - ETA: 1s - loss: 0.0201 - acc: 0.994 - ETA: 1s - loss: 0.0191 - acc: 0.994 - ETA: 1s - loss: 0.0185 - acc: 0.994 - ETA: 1s - loss: 0.0182 - acc: 0.995 - ETA: 0s - loss: 0.0180 - acc: 0.995 - ETA: 0s - loss: 0.0179 - acc: 0.995 - ETA: 0s - loss: 0.0175 - acc: 0.995 - ETA: 0s - loss: 0.0175 - acc: 0.995 - ETA: 0s - loss: 0.0172 - acc: 0.995 - ETA: 0s - loss: 0.0172 - acc: 0.995 - ETA: 0s - loss: 0.0169 - acc: 0.995 - ETA: 0s - loss: 0.0165 - acc: 0.996 - ETA: 0s - loss: 0.0171 - acc: 0.996 - ETA: 0s - loss: 0.0173 - acc: 0.995 - ETA: 0s - loss: 0.0179 - acc: 0.995 - ETA: 0s - loss: 0.0182 - acc: 0.995 - ETA: 0s - loss: 0.0180 - acc: 0.995 - ETA: 0s - loss: 0.0195 - acc: 0.995 - ETA: 0s - loss: 0.0194 - acc: 0.995 - 2s 369us/step - loss: 0.0192 - acc: 0.9954 - val_loss: 0.7141 - val_acc: 0.8371 Epoch 00015: val_loss did not improve Epoch 16/20 6680/6680 [==============================] - ETA: 2s - loss: 0.0368 - acc: 0.980 - ETA: 2s - loss: 0.0299 - acc: 0.983 - ETA: 2s - loss: 0.0219 - acc: 0.990 - ETA: 2s - loss: 0.0181 - acc: 0.992 - ETA: 1s - loss: 0.0154 - acc: 0.994 - ETA: 1s - loss: 0.0143 - acc: 0.994 - ETA: 1s - loss: 0.0128 - acc: 0.995 - ETA: 1s - loss: 0.0117 - acc: 0.996 - ETA: 1s - loss: 0.0107 - acc: 0.996 - ETA: 1s - loss: 0.0098 - acc: 0.996 - ETA: 1s - loss: 0.0095 - acc: 0.997 - ETA: 1s - loss: 0.0098 - acc: 0.997 - ETA: 1s - loss: 0.0099 - acc: 0.997 - ETA: 1s - loss: 0.0108 - acc: 0.997 - ETA: 1s - loss: 0.0113 - acc: 0.996 - ETA: 1s - loss: 0.0113 - acc: 0.997 - ETA: 1s - loss: 0.0108 - acc: 0.997 - ETA: 1s - loss: 0.0105 - acc: 0.997 - ETA: 1s - loss: 0.0108 - acc: 0.997 - ETA: 0s - loss: 0.0117 - acc: 0.996 - ETA: 0s - loss: 0.0119 - acc: 0.996 - ETA: 0s - loss: 0.0136 - acc: 0.996 - ETA: 0s - loss: 0.0149 - acc: 0.996 - ETA: 0s - loss: 0.0153 - acc: 0.996 - ETA: 0s - loss: 0.0158 - acc: 0.995 - ETA: 0s - loss: 0.0166 - acc: 0.995 - ETA: 0s - loss: 0.0163 - acc: 0.996 - ETA: 0s - loss: 0.0170 - acc: 0.995 - ETA: 0s - loss: 0.0179 - acc: 0.994 - ETA: 0s - loss: 0.0180 - acc: 0.994 - ETA: 0s - loss: 0.0180 - acc: 0.994 - ETA: 0s - loss: 0.0188 - acc: 0.994 - ETA: 0s - loss: 0.0190 - acc: 0.994 - 3s 377us/step - loss: 0.0187 - acc: 0.9945 - val_loss: 0.7492 - val_acc: 0.8443 Epoch 00016: val_loss did not improve Epoch 17/20 6680/6680 [==============================] - ETA: 2s - loss: 0.0033 - acc: 1.000 - ETA: 2s - loss: 0.0092 - acc: 1.000 - ETA: 2s - loss: 0.0065 - acc: 1.000 - ETA: 1s - loss: 0.0107 - acc: 0.998 - ETA: 1s - loss: 0.0116 - acc: 0.996 - ETA: 1s - loss: 0.0099 - acc: 0.997 - ETA: 1s - loss: 0.0095 - acc: 0.996 - ETA: 1s - loss: 0.0093 - acc: 0.996 - ETA: 1s - loss: 0.0107 - acc: 0.995 - ETA: 1s - loss: 0.0115 - acc: 0.995 - ETA: 1s - loss: 0.0119 - acc: 0.995 - ETA: 1s - loss: 0.0118 - acc: 0.995 - ETA: 1s - loss: 0.0129 - acc: 0.995 - ETA: 1s - loss: 0.0138 - acc: 0.994 - ETA: 1s - loss: 0.0131 - acc: 0.995 - ETA: 1s - loss: 0.0137 - acc: 0.995 - ETA: 1s - loss: 0.0134 - acc: 0.995 - ETA: 1s - loss: 0.0130 - acc: 0.995 - ETA: 0s - loss: 0.0138 - acc: 0.995 - ETA: 0s - loss: 0.0135 - acc: 0.995 - ETA: 0s - loss: 0.0134 - acc: 0.996 - ETA: 0s - loss: 0.0136 - acc: 0.996 - ETA: 0s - loss: 0.0134 - acc: 0.996 - ETA: 0s - loss: 0.0135 - acc: 0.996 - ETA: 0s - loss: 0.0146 - acc: 0.995 - ETA: 0s - loss: 0.0143 - acc: 0.995 - ETA: 0s - loss: 0.0141 - acc: 0.995 - ETA: 0s - loss: 0.0139 - acc: 0.995 - ETA: 
0s - loss: 0.0145 - acc: 0.995 - ETA: 0s - loss: 0.0142 - acc: 0.995 - ETA: 0s - loss: 0.0141 - acc: 0.995 - ETA: 0s - loss: 0.0139 - acc: 0.995 - ETA: 0s - loss: 0.0149 - acc: 0.995 - 2s 369us/step - loss: 0.0154 - acc: 0.9952 - val_loss: 0.7755 - val_acc: 0.8491 Epoch 00017: val_loss did not improve Epoch 18/20 6680/6680 [==============================] - ETA: 2s - loss: 0.0044 - acc: 1.000 - ETA: 2s - loss: 0.0055 - acc: 1.000 - ETA: 2s - loss: 0.0100 - acc: 0.996 - ETA: 2s - loss: 0.0084 - acc: 0.997 - ETA: 1s - loss: 0.0084 - acc: 0.997 - ETA: 1s - loss: 0.0148 - acc: 0.997 - ETA: 1s - loss: 0.0128 - acc: 0.997 - ETA: 1s - loss: 0.0122 - acc: 0.998 - ETA: 1s - loss: 0.0114 - acc: 0.998 - ETA: 1s - loss: 0.0107 - acc: 0.998 - ETA: 1s - loss: 0.0100 - acc: 0.998 - ETA: 1s - loss: 0.0093 - acc: 0.998 - ETA: 1s - loss: 0.0098 - acc: 0.998 - ETA: 1s - loss: 0.0095 - acc: 0.998 - ETA: 1s - loss: 0.0115 - acc: 0.997 - ETA: 1s - loss: 0.0110 - acc: 0.998 - ETA: 1s - loss: 0.0126 - acc: 0.997 - ETA: 1s - loss: 0.0121 - acc: 0.997 - ETA: 1s - loss: 0.0119 - acc: 0.997 - ETA: 0s - loss: 0.0118 - acc: 0.997 - ETA: 0s - loss: 0.0114 - acc: 0.997 - ETA: 0s - loss: 0.0112 - acc: 0.997 - ETA: 0s - loss: 0.0117 - acc: 0.997 - ETA: 0s - loss: 0.0117 - acc: 0.997 - ETA: 0s - loss: 0.0115 - acc: 0.997 - ETA: 0s - loss: 0.0114 - acc: 0.997 - ETA: 0s - loss: 0.0119 - acc: 0.997 - ETA: 0s - loss: 0.0117 - acc: 0.997 - ETA: 0s - loss: 0.0116 - acc: 0.997 - ETA: 0s - loss: 0.0124 - acc: 0.996 - ETA: 0s - loss: 0.0122 - acc: 0.996 - ETA: 0s - loss: 0.0119 - acc: 0.997 - ETA: 0s - loss: 0.0117 - acc: 0.997 - 3s 376us/step - loss: 0.0124 - acc: 0.9969 - val_loss: 0.7928 - val_acc: 0.8419 Epoch 00018: val_loss did not improve Epoch 19/20 6680/6680 [==============================] - ETA: 2s - loss: 0.0025 - acc: 1.000 - ETA: 2s - loss: 0.0060 - acc: 1.000 - ETA: 2s - loss: 0.0061 - acc: 1.000 - ETA: 1s - loss: 0.0053 - acc: 1.000 - ETA: 1s - loss: 0.0054 - acc: 1.000 - ETA: 1s - loss: 0.0060 - acc: 0.999 - ETA: 1s - loss: 0.0059 - acc: 0.999 - ETA: 1s - loss: 0.0065 - acc: 0.998 - ETA: 1s - loss: 0.0072 - acc: 0.998 - ETA: 1s - loss: 0.0069 - acc: 0.998 - ETA: 1s - loss: 0.0067 - acc: 0.998 - ETA: 1s - loss: 0.0070 - acc: 0.998 - ETA: 1s - loss: 0.0070 - acc: 0.998 - ETA: 1s - loss: 0.0069 - acc: 0.998 - ETA: 1s - loss: 0.0072 - acc: 0.998 - ETA: 1s - loss: 0.0072 - acc: 0.998 - ETA: 1s - loss: 0.0072 - acc: 0.998 - ETA: 1s - loss: 0.0070 - acc: 0.998 - ETA: 1s - loss: 0.0069 - acc: 0.998 - ETA: 0s - loss: 0.0069 - acc: 0.998 - ETA: 0s - loss: 0.0073 - acc: 0.998 - ETA: 0s - loss: 0.0072 - acc: 0.998 - ETA: 0s - loss: 0.0071 - acc: 0.998 - ETA: 0s - loss: 0.0073 - acc: 0.998 - ETA: 0s - loss: 0.0071 - acc: 0.998 - ETA: 0s - loss: 0.0070 - acc: 0.998 - ETA: 0s - loss: 0.0070 - acc: 0.998 - ETA: 0s - loss: 0.0070 - acc: 0.998 - ETA: 0s - loss: 0.0069 - acc: 0.998 - ETA: 0s - loss: 0.0070 - acc: 0.998 - ETA: 0s - loss: 0.0072 - acc: 0.998 - ETA: 0s - loss: 0.0105 - acc: 0.997 - ETA: 0s - loss: 0.0108 - acc: 0.997 - 2s 372us/step - loss: 0.0106 - acc: 0.9973 - val_loss: 0.7549 - val_acc: 0.8467 Epoch 00019: val_loss did not improve Epoch 20/20 6680/6680 [==============================] - ETA: 2s - loss: 0.0029 - acc: 1.000 - ETA: 2s - loss: 0.0043 - acc: 1.000 - ETA: 2s - loss: 0.0112 - acc: 0.996 - ETA: 2s - loss: 0.0092 - acc: 0.997 - ETA: 1s - loss: 0.0078 - acc: 0.997 - ETA: 1s - loss: 0.0067 - acc: 0.998 - ETA: 1s - loss: 0.0086 - acc: 0.997 - ETA: 1s - loss: 0.0108 - acc: 0.997 - ETA: 1s - loss: 0.0101 - acc: 
0.997 - ETA: 1s - loss: 0.0091 - acc: 0.997 - ETA: 1s - loss: 0.0088 - acc: 0.998 - ETA: 1s - loss: 0.0082 - acc: 0.998 - ETA: 1s - loss: 0.0098 - acc: 0.997 - ETA: 1s - loss: 0.0105 - acc: 0.997 - ETA: 1s - loss: 0.0113 - acc: 0.996 - ETA: 1s - loss: 0.0124 - acc: 0.996 - ETA: 1s - loss: 0.0123 - acc: 0.996 - ETA: 1s - loss: 0.0120 - acc: 0.996 - ETA: 0s - loss: 0.0119 - acc: 0.996 - ETA: 0s - loss: 0.0116 - acc: 0.996 - ETA: 0s - loss: 0.0117 - acc: 0.996 - ETA: 0s - loss: 0.0112 - acc: 0.997 - ETA: 0s - loss: 0.0114 - acc: 0.996 - ETA: 0s - loss: 0.0112 - acc: 0.996 - ETA: 0s - loss: 0.0115 - acc: 0.996 - ETA: 0s - loss: 0.0114 - acc: 0.996 - ETA: 0s - loss: 0.0115 - acc: 0.996 - ETA: 0s - loss: 0.0112 - acc: 0.996 - ETA: 0s - loss: 0.0109 - acc: 0.996 - ETA: 0s - loss: 0.0106 - acc: 0.996 - ETA: 0s - loss: 0.0114 - acc: 0.996 - ETA: 0s - loss: 0.0111 - acc: 0.996 - ETA: 0s - loss: 0.0110 - acc: 0.996 - 2s 371us/step - loss: 0.0107 - acc: 0.9970 - val_loss: 0.7599 - val_acc: 0.8503 Epoch 00020: val_loss did not improve we are at Xception_model4 Train on 6680 samples, validate on 835 samples Epoch 1/20 6680/6680 [==============================] - ETA: 51s - loss: 5.5546 - acc: 0.0000e+ - ETA: 27s - loss: 7.8772 - acc: 0.0850 - ETA: 19s - loss: 8.8351 - acc: 0.12 - ETA: 15s - loss: 9.0174 - acc: 0.16 - ETA: 12s - loss: 8.7755 - acc: 0.19 - ETA: 11s - loss: 8.9367 - acc: 0.21 - ETA: 10s - loss: 9.1643 - acc: 0.21 - ETA: 9s - loss: 9.1624 - acc: 0.2275 - ETA: 8s - loss: 9.2030 - acc: 0.238 - ETA: 7s - loss: 9.1207 - acc: 0.254 - ETA: 7s - loss: 9.0699 - acc: 0.267 - ETA: 6s - loss: 9.0091 - acc: 0.282 - ETA: 6s - loss: 8.9451 - acc: 0.293 - ETA: 6s - loss: 8.8660 - acc: 0.303 - ETA: 5s - loss: 8.9164 - acc: 0.307 - ETA: 5s - loss: 8.8352 - acc: 0.316 - ETA: 5s - loss: 8.8544 - acc: 0.318 - ETA: 5s - loss: 8.8256 - acc: 0.323 - ETA: 4s - loss: 8.7611 - acc: 0.330 - ETA: 4s - loss: 8.6958 - acc: 0.336 - ETA: 4s - loss: 8.7180 - acc: 0.335 - ETA: 4s - loss: 8.6889 - acc: 0.340 - ETA: 4s - loss: 8.6637 - acc: 0.343 - ETA: 4s - loss: 8.6637 - acc: 0.345 - ETA: 3s - loss: 8.7087 - acc: 0.343 - ETA: 3s - loss: 8.6584 - acc: 0.348 - ETA: 3s - loss: 8.6113 - acc: 0.353 - ETA: 3s - loss: 8.5701 - acc: 0.358 - ETA: 3s - loss: 8.5301 - acc: 0.361 - ETA: 3s - loss: 8.5134 - acc: 0.363 - ETA: 3s - loss: 8.5076 - acc: 0.364 - ETA: 3s - loss: 8.5132 - acc: 0.364 - ETA: 2s - loss: 8.5009 - acc: 0.363 - ETA: 2s - loss: 8.4601 - acc: 0.367 - ETA: 2s - loss: 8.4643 - acc: 0.367 - ETA: 2s - loss: 8.4727 - acc: 0.368 - ETA: 2s - loss: 8.4942 - acc: 0.368 - ETA: 2s - loss: 8.4826 - acc: 0.371 - ETA: 2s - loss: 8.4707 - acc: 0.372 - ETA: 2s - loss: 8.4897 - acc: 0.372 - ETA: 2s - loss: 8.4843 - acc: 0.373 - ETA: 2s - loss: 8.4822 - acc: 0.375 - ETA: 1s - loss: 8.4913 - acc: 0.375 - ETA: 1s - loss: 8.4796 - acc: 0.378 - ETA: 1s - loss: 8.4567 - acc: 0.380 - ETA: 1s - loss: 8.4831 - acc: 0.379 - ETA: 1s - loss: 8.4770 - acc: 0.381 - ETA: 1s - loss: 8.4691 - acc: 0.383 - ETA: 1s - loss: 8.4482 - acc: 0.385 - ETA: 1s - loss: 8.4364 - acc: 0.386 - ETA: 1s - loss: 8.4221 - acc: 0.388 - ETA: 1s - loss: 8.4351 - acc: 0.388 - ETA: 1s - loss: 8.4277 - acc: 0.388 - ETA: 0s - loss: 8.4394 - acc: 0.388 - ETA: 0s - loss: 8.4106 - acc: 0.390 - ETA: 0s - loss: 8.4052 - acc: 0.391 - ETA: 0s - loss: 8.4219 - acc: 0.390 - ETA: 0s - loss: 8.3966 - acc: 0.392 - ETA: 0s - loss: 8.3846 - acc: 0.393 - ETA: 0s - loss: 8.3904 - acc: 0.393 - ETA: 0s - loss: 8.3781 - acc: 0.394 - ETA: 0s - loss: 8.3468 - acc: 0.396 - ETA: 0s - loss: 8.3552 - 
acc: 0.396 - ETA: 0s - loss: 8.3460 - acc: 0.396 - ETA: 0s - loss: 8.3373 - acc: 0.397 - ETA: 0s - loss: 8.3359 - acc: 0.398 - 5s 797us/step - loss: 8.3273 - acc: 0.3993 - val_loss: 7.9430 - val_acc: 0.4479 Epoch 00001: val_loss improved from inf to 7.94301, saving model to saved_models/weights.best.Xception4.hdf5 Epoch 2/20 6680/6680 [==============================] - ETA: 3s - loss: 7.5331 - acc: 0.500 - ETA: 4s - loss: 7.3039 - acc: 0.505 - ETA: 4s - loss: 7.4268 - acc: 0.503 - ETA: 3s - loss: 7.9466 - acc: 0.470 - ETA: 3s - loss: 7.8922 - acc: 0.470 - ETA: 3s - loss: 7.8329 - acc: 0.475 - ETA: 3s - loss: 7.8321 - acc: 0.472 - ETA: 3s - loss: 7.7846 - acc: 0.477 - ETA: 3s - loss: 7.8108 - acc: 0.475 - ETA: 3s - loss: 7.7745 - acc: 0.479 - ETA: 3s - loss: 7.7327 - acc: 0.480 - ETA: 3s - loss: 7.8201 - acc: 0.475 - ETA: 3s - loss: 7.8748 - acc: 0.474 - ETA: 3s - loss: 7.8217 - acc: 0.476 - ETA: 3s - loss: 7.8114 - acc: 0.478 - ETA: 3s - loss: 7.8949 - acc: 0.473 - ETA: 3s - loss: 7.8807 - acc: 0.474 - ETA: 3s - loss: 7.8178 - acc: 0.478 - ETA: 3s - loss: 7.8338 - acc: 0.476 - ETA: 3s - loss: 7.7927 - acc: 0.478 - ETA: 2s - loss: 7.7486 - acc: 0.481 - ETA: 2s - loss: 7.7631 - acc: 0.479 - ETA: 2s - loss: 7.7751 - acc: 0.478 - ETA: 2s - loss: 7.7607 - acc: 0.481 - ETA: 2s - loss: 7.7570 - acc: 0.482 - ETA: 2s - loss: 7.7688 - acc: 0.481 - ETA: 2s - loss: 7.7794 - acc: 0.480 - ETA: 2s - loss: 7.8087 - acc: 0.478 - ETA: 2s - loss: 7.8143 - acc: 0.478 - ETA: 2s - loss: 7.8172 - acc: 0.478 - ETA: 2s - loss: 7.8005 - acc: 0.479 - ETA: 2s - loss: 7.7638 - acc: 0.481 - ETA: 2s - loss: 7.7614 - acc: 0.482 - ETA: 2s - loss: 7.7629 - acc: 0.482 - ETA: 2s - loss: 7.6906 - acc: 0.486 - ETA: 1s - loss: 7.6788 - acc: 0.486 - ETA: 1s - loss: 7.6839 - acc: 0.486 - ETA: 1s - loss: 7.6578 - acc: 0.489 - ETA: 1s - loss: 7.6650 - acc: 0.488 - ETA: 1s - loss: 7.6664 - acc: 0.488 - ETA: 1s - loss: 7.6564 - acc: 0.489 - ETA: 1s - loss: 7.6222 - acc: 0.491 - ETA: 1s - loss: 7.6241 - acc: 0.490 - ETA: 1s - loss: 7.6258 - acc: 0.490 - ETA: 1s - loss: 7.6034 - acc: 0.492 - ETA: 1s - loss: 7.6352 - acc: 0.490 - ETA: 1s - loss: 7.6285 - acc: 0.491 - ETA: 1s - loss: 7.6131 - acc: 0.492 - ETA: 1s - loss: 7.6266 - acc: 0.491 - ETA: 1s - loss: 7.6462 - acc: 0.490 - ETA: 1s - loss: 7.6545 - acc: 0.489 - ETA: 0s - loss: 7.6624 - acc: 0.488 - ETA: 0s - loss: 7.6579 - acc: 0.489 - ETA: 0s - loss: 7.6488 - acc: 0.490 - ETA: 0s - loss: 7.6545 - acc: 0.490 - ETA: 0s - loss: 7.6306 - acc: 0.491 - ETA: 0s - loss: 7.6286 - acc: 0.491 - ETA: 0s - loss: 7.6319 - acc: 0.491 - ETA: 0s - loss: 7.6334 - acc: 0.491 - ETA: 0s - loss: 7.6256 - acc: 0.492 - ETA: 0s - loss: 7.6062 - acc: 0.493 - ETA: 0s - loss: 7.6209 - acc: 0.492 - ETA: 0s - loss: 7.5993 - acc: 0.494 - ETA: 0s - loss: 7.5913 - acc: 0.494 - ETA: 0s - loss: 7.5953 - acc: 0.493 - ETA: 0s - loss: 7.5873 - acc: 0.493 - 5s 684us/step - loss: 7.5767 - acc: 0.4942 - val_loss: 8.0001 - val_acc: 0.4479 Epoch 00002: val_loss did not improve Epoch 3/20 6680/6680 [==============================] - ETA: 4s - loss: 6.4399 - acc: 0.560 - ETA: 4s - loss: 7.9067 - acc: 0.470 - ETA: 4s - loss: 7.6361 - acc: 0.496 - ETA: 3s - loss: 7.8354 - acc: 0.487 - ETA: 3s - loss: 7.9601 - acc: 0.480 - ETA: 3s - loss: 7.9698 - acc: 0.480 - ETA: 3s - loss: 8.0811 - acc: 0.472 - ETA: 3s - loss: 7.9475 - acc: 0.482 - ETA: 3s - loss: 7.8903 - acc: 0.486 - ETA: 3s - loss: 7.7985 - acc: 0.490 - ETA: 3s - loss: 7.7152 - acc: 0.496 - ETA: 3s - loss: 7.6000 - acc: 0.505 - ETA: 3s - loss: 7.5287 - acc: 0.509 - ETA: 
3s - loss: 7.5760 - acc: 0.505 - ETA: 3s - loss: 7.6733 - acc: 0.498 - ETA: 3s - loss: 7.6714 - acc: 0.497 - ETA: 3s - loss: 7.5673 - acc: 0.504 - ETA: 3s - loss: 7.5470 - acc: 0.506 - ETA: 3s - loss: 7.5963 - acc: 0.503 - ETA: 2s - loss: 7.5486 - acc: 0.507 - ETA: 2s - loss: 7.5543 - acc: 0.507 - ETA: 2s - loss: 7.5153 - acc: 0.509 - ETA: 2s - loss: 7.4488 - acc: 0.514 - ETA: 2s - loss: 7.4508 - acc: 0.513 - ETA: 2s - loss: 7.4773 - acc: 0.511 - ETA: 2s - loss: 7.5036 - acc: 0.509 - ETA: 2s - loss: 7.4927 - acc: 0.510 - ETA: 2s - loss: 7.5061 - acc: 0.508 - ETA: 2s - loss: 7.5315 - acc: 0.506 - ETA: 2s - loss: 7.5096 - acc: 0.508 - ETA: 2s - loss: 7.4595 - acc: 0.511 - ETA: 2s - loss: 7.4560 - acc: 0.511 - ETA: 2s - loss: 7.4693 - acc: 0.511 - ETA: 2s - loss: 7.4505 - acc: 0.512 - ETA: 2s - loss: 7.4415 - acc: 0.514 - ETA: 1s - loss: 7.4449 - acc: 0.513 - ETA: 1s - loss: 7.4573 - acc: 0.513 - ETA: 1s - loss: 7.4353 - acc: 0.515 - ETA: 1s - loss: 7.4372 - acc: 0.515 - ETA: 1s - loss: 7.4155 - acc: 0.517 - ETA: 1s - loss: 7.4079 - acc: 0.517 - ETA: 1s - loss: 7.3790 - acc: 0.518 - ETA: 1s - loss: 7.4003 - acc: 0.517 - ETA: 1s - loss: 7.3802 - acc: 0.518 - ETA: 1s - loss: 7.3389 - acc: 0.520 - ETA: 1s - loss: 7.3184 - acc: 0.522 - ETA: 1s - loss: 7.3454 - acc: 0.520 - ETA: 1s - loss: 7.3571 - acc: 0.520 - ETA: 1s - loss: 7.3925 - acc: 0.518 - ETA: 1s - loss: 7.3789 - acc: 0.519 - ETA: 1s - loss: 7.3797 - acc: 0.519 - ETA: 0s - loss: 7.3715 - acc: 0.520 - ETA: 0s - loss: 7.3691 - acc: 0.519 - ETA: 0s - loss: 7.3532 - acc: 0.520 - ETA: 0s - loss: 7.3716 - acc: 0.519 - ETA: 0s - loss: 7.3779 - acc: 0.519 - ETA: 0s - loss: 7.3930 - acc: 0.518 - ETA: 0s - loss: 7.3918 - acc: 0.517 - ETA: 0s - loss: 7.3858 - acc: 0.517 - ETA: 0s - loss: 7.3756 - acc: 0.518 - ETA: 0s - loss: 7.3681 - acc: 0.518 - ETA: 0s - loss: 7.3727 - acc: 0.518 - ETA: 0s - loss: 7.3620 - acc: 0.519 - ETA: 0s - loss: 7.3589 - acc: 0.519 - ETA: 0s - loss: 7.3689 - acc: 0.519 - ETA: 0s - loss: 7.3797 - acc: 0.518 - 5s 683us/step - loss: 7.3960 - acc: 0.5177 - val_loss: 7.8738 - val_acc: 0.4743 Epoch 00003: val_loss improved from 7.94301 to 7.87379, saving model to saved_models/weights.best.Xception4.hdf5 Epoch 4/20 6680/6680 [==============================] - ETA: 3s - loss: 6.7703 - acc: 0.580 - ETA: 3s - loss: 7.0944 - acc: 0.560 - ETA: 3s - loss: 6.7289 - acc: 0.576 - ETA: 3s - loss: 7.0242 - acc: 0.557 - ETA: 3s - loss: 6.9438 - acc: 0.562 - ETA: 3s - loss: 7.0300 - acc: 0.556 - ETA: 3s - loss: 6.9240 - acc: 0.564 - ETA: 3s - loss: 7.0256 - acc: 0.558 - ETA: 3s - loss: 7.1625 - acc: 0.547 - ETA: 3s - loss: 7.0628 - acc: 0.553 - ETA: 3s - loss: 7.1459 - acc: 0.545 - ETA: 3s - loss: 7.1516 - acc: 0.545 - ETA: 3s - loss: 7.2122 - acc: 0.540 - ETA: 3s - loss: 7.2038 - acc: 0.541 - ETA: 3s - loss: 7.2740 - acc: 0.537 - ETA: 3s - loss: 7.2766 - acc: 0.537 - ETA: 3s - loss: 7.3133 - acc: 0.534 - ETA: 3s - loss: 7.2745 - acc: 0.537 - ETA: 3s - loss: 7.1855 - acc: 0.542 - ETA: 2s - loss: 7.2438 - acc: 0.538 - ETA: 2s - loss: 7.2408 - acc: 0.538 - ETA: 2s - loss: 7.2423 - acc: 0.538 - ETA: 2s - loss: 7.2168 - acc: 0.539 - ETA: 2s - loss: 7.2051 - acc: 0.540 - ETA: 2s - loss: 7.2031 - acc: 0.540 - ETA: 2s - loss: 7.2568 - acc: 0.536 - ETA: 2s - loss: 7.2491 - acc: 0.537 - ETA: 2s - loss: 7.2757 - acc: 0.535 - ETA: 2s - loss: 7.2696 - acc: 0.535 - ETA: 2s - loss: 7.2809 - acc: 0.535 - ETA: 2s - loss: 7.2762 - acc: 0.535 - ETA: 2s - loss: 7.2891 - acc: 0.533 - ETA: 2s - loss: 7.2647 - acc: 0.535 - ETA: 2s - loss: 7.2713 - acc: 0.533 - ETA: 
2s - loss: 7.2635 - acc: 0.534 - ETA: 1s - loss: 7.2575 - acc: 0.534 - ETA: 1s - loss: 7.2508 - acc: 0.534 - ETA: 1s - loss: 7.2483 - acc: 0.535 - ETA: 1s - loss: 7.2422 - acc: 0.535 - ETA: 1s - loss: 7.2306 - acc: 0.536 - ETA: 1s - loss: 7.2481 - acc: 0.535 - ETA: 1s - loss: 7.2162 - acc: 0.536 - ETA: 1s - loss: 7.2231 - acc: 0.536 - ETA: 1s - loss: 7.2227 - acc: 0.536 - ETA: 1s - loss: 7.2388 - acc: 0.535 - ETA: 1s - loss: 7.2480 - acc: 0.534 - ETA: 1s - loss: 7.2722 - acc: 0.533 - ETA: 1s - loss: 7.3036 - acc: 0.531 - ETA: 1s - loss: 7.3307 - acc: 0.529 - ETA: 1s - loss: 7.3293 - acc: 0.529 - ETA: 1s - loss: 7.3122 - acc: 0.530 - ETA: 0s - loss: 7.3011 - acc: 0.531 - ETA: 0s - loss: 7.2881 - acc: 0.532 - ETA: 0s - loss: 7.2868 - acc: 0.532 - ETA: 0s - loss: 7.2695 - acc: 0.533 - ETA: 0s - loss: 7.2744 - acc: 0.533 - ETA: 0s - loss: 7.2838 - acc: 0.532 - ETA: 0s - loss: 7.2880 - acc: 0.532 - ETA: 0s - loss: 7.2935 - acc: 0.531 - ETA: 0s - loss: 7.3020 - acc: 0.531 - ETA: 0s - loss: 7.2994 - acc: 0.531 - ETA: 0s - loss: 7.2988 - acc: 0.532 - ETA: 0s - loss: 7.3004 - acc: 0.531 - ETA: 0s - loss: 7.3238 - acc: 0.530 - ETA: 0s - loss: 7.3135 - acc: 0.531 - ETA: 0s - loss: 7.3066 - acc: 0.531 - 5s 681us/step - loss: 7.3005 - acc: 0.5316 - val_loss: 7.8347 - val_acc: 0.4814 Epoch 00004: val_loss improved from 7.87379 to 7.83474, saving model to saved_models/weights.best.Xception4.hdf5 Epoch 5/20 6680/6680 [==============================] - ETA: 3s - loss: 7.5993 - acc: 0.510 - ETA: 4s - loss: 7.3458 - acc: 0.535 - ETA: 3s - loss: 7.0746 - acc: 0.550 - ETA: 3s - loss: 7.0868 - acc: 0.550 - ETA: 3s - loss: 7.2404 - acc: 0.538 - ETA: 3s - loss: 7.0019 - acc: 0.551 - ETA: 3s - loss: 7.0156 - acc: 0.550 - ETA: 3s - loss: 6.9452 - acc: 0.551 - ETA: 3s - loss: 6.9986 - acc: 0.548 - ETA: 3s - loss: 7.0390 - acc: 0.547 - ETA: 3s - loss: 7.0609 - acc: 0.546 - ETA: 3s - loss: 7.0021 - acc: 0.550 - ETA: 3s - loss: 7.0576 - acc: 0.547 - ETA: 3s - loss: 7.1378 - acc: 0.542 - ETA: 3s - loss: 7.2064 - acc: 0.538 - ETA: 3s - loss: 7.1998 - acc: 0.538 - ETA: 3s - loss: 7.2422 - acc: 0.536 - ETA: 3s - loss: 7.2085 - acc: 0.538 - ETA: 3s - loss: 7.2887 - acc: 0.533 - ETA: 2s - loss: 7.2591 - acc: 0.535 - ETA: 2s - loss: 7.2831 - acc: 0.534 - ETA: 2s - loss: 7.2757 - acc: 0.534 - ETA: 2s - loss: 7.2249 - acc: 0.537 - ETA: 2s - loss: 7.2346 - acc: 0.536 - ETA: 2s - loss: 7.1748 - acc: 0.540 - ETA: 2s - loss: 7.1492 - acc: 0.542 - ETA: 2s - loss: 7.1555 - acc: 0.541 - ETA: 2s - loss: 7.1694 - acc: 0.540 - ETA: 2s - loss: 7.1537 - acc: 0.542 - ETA: 2s - loss: 7.1571 - acc: 0.542 - ETA: 2s - loss: 7.1471 - acc: 0.543 - ETA: 2s - loss: 7.1776 - acc: 0.540 - ETA: 2s - loss: 7.2002 - acc: 0.539 - ETA: 2s - loss: 7.1753 - acc: 0.541 - ETA: 2s - loss: 7.1604 - acc: 0.542 - ETA: 1s - loss: 7.1295 - acc: 0.544 - ETA: 1s - loss: 7.1060 - acc: 0.545 - ETA: 1s - loss: 7.1635 - acc: 0.542 - ETA: 1s - loss: 7.1956 - acc: 0.540 - ETA: 1s - loss: 7.1960 - acc: 0.540 - ETA: 1s - loss: 7.1706 - acc: 0.541 - ETA: 1s - loss: 7.1735 - acc: 0.541 - ETA: 1s - loss: 7.1942 - acc: 0.540 - ETA: 1s - loss: 7.1899 - acc: 0.540 - ETA: 1s - loss: 7.1917 - acc: 0.540 - ETA: 1s - loss: 7.1730 - acc: 0.541 - ETA: 1s - loss: 7.1650 - acc: 0.542 - ETA: 1s - loss: 7.1583 - acc: 0.542 - ETA: 1s - loss: 7.1346 - acc: 0.544 - ETA: 1s - loss: 7.1460 - acc: 0.543 - ETA: 1s - loss: 7.1645 - acc: 0.542 - ETA: 0s - loss: 7.1648 - acc: 0.542 - ETA: 0s - loss: 7.1789 - acc: 0.541 - ETA: 0s - loss: 7.1852 - acc: 0.541 - ETA: 0s - loss: 7.1775 - acc: 0.541 - ETA: 
0s - loss: 7.1925 - acc: 0.540 - ETA: 0s - loss: 7.2218 - acc: 0.539 - ETA: 0s - loss: 7.2155 - acc: 0.539 - ETA: 0s - loss: 7.2248 - acc: 0.539 - ETA: 0s - loss: 7.2092 - acc: 0.540 - ETA: 0s - loss: 7.2294 - acc: 0.539 - ETA: 0s - loss: 7.2335 - acc: 0.538 - ETA: 0s - loss: 7.2454 - acc: 0.537 - ETA: 0s - loss: 7.2521 - acc: 0.536 - ETA: 0s - loss: 7.2572 - acc: 0.536 - ETA: 0s - loss: 7.2629 - acc: 0.535 - 5s 681us/step - loss: 7.2717 - acc: 0.5353 - val_loss: 7.8101 - val_acc: 0.4778 Epoch 00005: val_loss improved from 7.83474 to 7.81007, saving model to saved_models/weights.best.Xception4.hdf5 Epoch 6/20 6680/6680 [==============================] - ETA: 3s - loss: 5.9640 - acc: 0.630 - ETA: 3s - loss: 6.7127 - acc: 0.580 - ETA: 3s - loss: 6.9514 - acc: 0.560 - ETA: 3s - loss: 6.9036 - acc: 0.562 - ETA: 3s - loss: 7.0062 - acc: 0.558 - ETA: 3s - loss: 6.8921 - acc: 0.560 - ETA: 3s - loss: 6.9232 - acc: 0.558 - ETA: 3s - loss: 6.8668 - acc: 0.562 - ETA: 3s - loss: 6.7290 - acc: 0.571 - ETA: 3s - loss: 6.8298 - acc: 0.566 - ETA: 3s - loss: 6.8172 - acc: 0.567 - ETA: 3s - loss: 6.8677 - acc: 0.564 - ETA: 3s - loss: 6.8671 - acc: 0.563 - ETA: 3s - loss: 6.8547 - acc: 0.564 - ETA: 3s - loss: 6.9350 - acc: 0.560 - ETA: 3s - loss: 7.0434 - acc: 0.553 - ETA: 3s - loss: 7.0358 - acc: 0.552 - ETA: 3s - loss: 7.0388 - acc: 0.552 - ETA: 3s - loss: 6.9996 - acc: 0.555 - ETA: 3s - loss: 7.0044 - acc: 0.555 - ETA: 2s - loss: 7.0624 - acc: 0.552 - ETA: 2s - loss: 7.0572 - acc: 0.552 - ETA: 2s - loss: 7.0261 - acc: 0.554 - ETA: 2s - loss: 7.0096 - acc: 0.555 - ETA: 2s - loss: 6.9956 - acc: 0.556 - ETA: 2s - loss: 6.9995 - acc: 0.556 - ETA: 2s - loss: 6.9816 - acc: 0.557 - ETA: 2s - loss: 6.9342 - acc: 0.560 - ETA: 2s - loss: 6.9749 - acc: 0.557 - ETA: 2s - loss: 6.9960 - acc: 0.556 - ETA: 2s - loss: 7.0110 - acc: 0.555 - ETA: 2s - loss: 6.9836 - acc: 0.557 - ETA: 2s - loss: 7.0293 - acc: 0.553 - ETA: 2s - loss: 7.0567 - acc: 0.551 - ETA: 2s - loss: 7.0590 - acc: 0.551 - ETA: 1s - loss: 7.0590 - acc: 0.551 - ETA: 1s - loss: 7.0620 - acc: 0.550 - ETA: 1s - loss: 7.0735 - acc: 0.550 - ETA: 1s - loss: 7.0868 - acc: 0.549 - ETA: 1s - loss: 7.0973 - acc: 0.549 - ETA: 1s - loss: 7.0831 - acc: 0.550 - ETA: 1s - loss: 7.1099 - acc: 0.548 - ETA: 1s - loss: 7.1033 - acc: 0.548 - ETA: 1s - loss: 7.1178 - acc: 0.548 - ETA: 1s - loss: 7.1351 - acc: 0.547 - ETA: 1s - loss: 7.1398 - acc: 0.546 - ETA: 1s - loss: 7.1496 - acc: 0.546 - ETA: 1s - loss: 7.1278 - acc: 0.547 - ETA: 1s - loss: 7.1146 - acc: 0.548 - ETA: 1s - loss: 7.1409 - acc: 0.546 - ETA: 1s - loss: 7.1418 - acc: 0.546 - ETA: 0s - loss: 7.1515 - acc: 0.546 - ETA: 0s - loss: 7.1435 - acc: 0.546 - ETA: 0s - loss: 7.1521 - acc: 0.545 - ETA: 0s - loss: 7.1571 - acc: 0.545 - ETA: 0s - loss: 7.1503 - acc: 0.545 - ETA: 0s - loss: 7.1566 - acc: 0.545 - ETA: 0s - loss: 7.1458 - acc: 0.546 - ETA: 0s - loss: 7.1262 - acc: 0.547 - ETA: 0s - loss: 7.1112 - acc: 0.548 - ETA: 0s - loss: 7.1300 - acc: 0.547 - ETA: 0s - loss: 7.1245 - acc: 0.547 - ETA: 0s - loss: 7.1266 - acc: 0.547 - ETA: 0s - loss: 7.1578 - acc: 0.545 - ETA: 0s - loss: 7.1500 - acc: 0.545 - ETA: 0s - loss: 7.1463 - acc: 0.545 - 5s 682us/step - loss: 7.1407 - acc: 0.5460 - val_loss: 7.7843 - val_acc: 0.4778 Epoch 00006: val_loss improved from 7.81007 to 7.78435, saving model to saved_models/weights.best.Xception4.hdf5 Epoch 7/20 6680/6680 [==============================] - ETA: 4s - loss: 7.2536 - acc: 0.550 - ETA: 4s - loss: 7.1993 - acc: 0.545 - ETA: 4s - loss: 7.2177 - acc: 0.546 - ETA: 3s - loss: 
7.4315 - acc: 0.532 - ETA: 3s - loss: 7.1711 - acc: 0.550 - ETA: 3s - loss: 7.2659 - acc: 0.545 - ETA: 3s - loss: 7.2783 - acc: 0.544 - ETA: 3s - loss: 7.1543 - acc: 0.552 - ETA: 3s - loss: 7.1727 - acc: 0.551 - ETA: 3s - loss: 7.1489 - acc: 0.553 - ETA: 3s - loss: 7.0024 - acc: 0.561 - ETA: 3s - loss: 6.9461 - acc: 0.565 - ETA: 3s - loss: 7.0070 - acc: 0.561 - ETA: 3s - loss: 7.0264 - acc: 0.560 - ETA: 3s - loss: 6.9988 - acc: 0.562 - ETA: 3s - loss: 6.9643 - acc: 0.564 - ETA: 3s - loss: 7.0067 - acc: 0.561 - ETA: 3s - loss: 7.0278 - acc: 0.560 - ETA: 3s - loss: 6.9382 - acc: 0.566 - ETA: 2s - loss: 7.0023 - acc: 0.562 - ETA: 2s - loss: 7.0335 - acc: 0.560 - ETA: 2s - loss: 7.0299 - acc: 0.560 - ETA: 2s - loss: 7.0122 - acc: 0.561 - ETA: 2s - loss: 7.0165 - acc: 0.560 - ETA: 2s - loss: 7.0776 - acc: 0.557 - ETA: 2s - loss: 7.0907 - acc: 0.556 - ETA: 2s - loss: 7.0609 - acc: 0.558 - ETA: 2s - loss: 7.1028 - acc: 0.555 - ETA: 2s - loss: 7.1142 - acc: 0.554 - ETA: 2s - loss: 7.1046 - acc: 0.555 - ETA: 2s - loss: 7.1164 - acc: 0.553 - ETA: 2s - loss: 7.1411 - acc: 0.552 - ETA: 2s - loss: 7.1552 - acc: 0.551 - ETA: 2s - loss: 7.1418 - acc: 0.552 - ETA: 2s - loss: 7.1597 - acc: 0.550 - ETA: 1s - loss: 7.1808 - acc: 0.549 - ETA: 1s - loss: 7.1437 - acc: 0.551 - ETA: 1s - loss: 7.1720 - acc: 0.549 - ETA: 1s - loss: 7.1257 - acc: 0.552 - ETA: 1s - loss: 7.1485 - acc: 0.550 - ETA: 1s - loss: 7.1707 - acc: 0.549 - ETA: 1s - loss: 7.1298 - acc: 0.551 - ETA: 1s - loss: 7.1255 - acc: 0.551 - ETA: 1s - loss: 7.1174 - acc: 0.551 - ETA: 1s - loss: 7.1066 - acc: 0.552 - ETA: 1s - loss: 7.0941 - acc: 0.553 - ETA: 1s - loss: 7.0906 - acc: 0.553 - ETA: 1s - loss: 7.0840 - acc: 0.554 - ETA: 1s - loss: 7.0745 - acc: 0.554 - ETA: 1s - loss: 7.0711 - acc: 0.554 - ETA: 1s - loss: 7.0661 - acc: 0.554 - ETA: 0s - loss: 7.0894 - acc: 0.553 - ETA: 0s - loss: 7.0775 - acc: 0.553 - ETA: 0s - loss: 7.0665 - acc: 0.554 - ETA: 0s - loss: 7.0933 - acc: 0.552 - ETA: 0s - loss: 7.0951 - acc: 0.552 - ETA: 0s - loss: 7.0843 - acc: 0.553 - ETA: 0s - loss: 7.0872 - acc: 0.553 - ETA: 0s - loss: 7.0890 - acc: 0.553 - ETA: 0s - loss: 7.0861 - acc: 0.553 - ETA: 0s - loss: 7.0943 - acc: 0.552 - ETA: 0s - loss: 7.0839 - acc: 0.553 - ETA: 0s - loss: 7.0917 - acc: 0.552 - ETA: 0s - loss: 7.0792 - acc: 0.553 - ETA: 0s - loss: 7.0928 - acc: 0.552 - ETA: 0s - loss: 7.0806 - acc: 0.553 - 5s 674us/step - loss: 7.0892 - acc: 0.5531 - val_loss: 7.7660 - val_acc: 0.4802 Epoch 00007: val_loss improved from 7.78435 to 7.76598, saving model to saved_models/weights.best.Xception4.hdf5 Epoch 8/20 6680/6680 [==============================] - ETA: 3s - loss: 6.3125 - acc: 0.600 - ETA: 4s - loss: 6.9444 - acc: 0.565 - ETA: 4s - loss: 6.5654 - acc: 0.590 - ETA: 3s - loss: 6.7996 - acc: 0.572 - ETA: 3s - loss: 6.9548 - acc: 0.564 - ETA: 3s - loss: 6.9014 - acc: 0.565 - ETA: 3s - loss: 6.8136 - acc: 0.571 - ETA: 3s - loss: 6.8282 - acc: 0.571 - ETA: 3s - loss: 6.8576 - acc: 0.570 - ETA: 3s - loss: 6.8349 - acc: 0.571 - ETA: 3s - loss: 6.8439 - acc: 0.570 - ETA: 3s - loss: 6.8377 - acc: 0.571 - ETA: 3s - loss: 6.9564 - acc: 0.564 - ETA: 3s - loss: 6.9916 - acc: 0.562 - ETA: 3s - loss: 7.0093 - acc: 0.561 - ETA: 3s - loss: 6.9843 - acc: 0.563 - ETA: 3s - loss: 6.9236 - acc: 0.566 - ETA: 3s - loss: 6.8792 - acc: 0.569 - ETA: 2s - loss: 6.9041 - acc: 0.567 - ETA: 2s - loss: 6.9571 - acc: 0.564 - ETA: 2s - loss: 6.9638 - acc: 0.563 - ETA: 2s - loss: 7.0077 - acc: 0.560 - ETA: 2s - loss: 6.9133 - acc: 0.567 - ETA: 2s - loss: 6.8747 - acc: 0.568 - ETA: 2s - loss: 
6.8891 - acc: 0.568 - ETA: 2s - loss: 6.9373 - acc: 0.564 - ETA: 2s - loss: 6.9439 - acc: 0.564 - ETA: 2s - loss: 6.9770 - acc: 0.561 - ETA: 2s - loss: 6.9855 - acc: 0.560 - ETA: 2s - loss: 6.9979 - acc: 0.559 - ETA: 2s - loss: 6.9661 - acc: 0.561 - ETA: 2s - loss: 6.9451 - acc: 0.562 - ETA: 2s - loss: 6.9638 - acc: 0.560 - ETA: 2s - loss: 6.9913 - acc: 0.559 - ETA: 2s - loss: 7.0234 - acc: 0.557 - ETA: 1s - loss: 7.0567 - acc: 0.555 - ETA: 1s - loss: 7.0586 - acc: 0.555 - ETA: 1s - loss: 7.0807 - acc: 0.554 - ETA: 1s - loss: 7.0975 - acc: 0.553 - ETA: 1s - loss: 7.0974 - acc: 0.553 - ETA: 1s - loss: 7.0738 - acc: 0.555 - ETA: 1s - loss: 7.0666 - acc: 0.555 - ETA: 1s - loss: 7.0672 - acc: 0.555 - ETA: 1s - loss: 7.0645 - acc: 0.555 - ETA: 1s - loss: 7.0832 - acc: 0.554 - ETA: 1s - loss: 7.1102 - acc: 0.552 - ETA: 1s - loss: 7.1098 - acc: 0.553 - ETA: 1s - loss: 7.1161 - acc: 0.552 - ETA: 1s - loss: 7.0927 - acc: 0.554 - ETA: 1s - loss: 7.0863 - acc: 0.554 - ETA: 0s - loss: 7.0706 - acc: 0.555 - ETA: 0s - loss: 7.0617 - acc: 0.556 - ETA: 0s - loss: 7.0349 - acc: 0.558 - ETA: 0s - loss: 7.0300 - acc: 0.558 - ETA: 0s - loss: 7.0165 - acc: 0.559 - ETA: 0s - loss: 7.0438 - acc: 0.558 - ETA: 0s - loss: 7.0698 - acc: 0.556 - ETA: 0s - loss: 7.0845 - acc: 0.555 - ETA: 0s - loss: 7.0973 - acc: 0.554 - ETA: 0s - loss: 7.0861 - acc: 0.554 - ETA: 0s - loss: 7.1143 - acc: 0.553 - ETA: 0s - loss: 7.1166 - acc: 0.552 - ETA: 0s - loss: 7.1060 - acc: 0.553 - ETA: 0s - loss: 7.1039 - acc: 0.553 - ETA: 0s - loss: 7.0946 - acc: 0.554 - ETA: 0s - loss: 7.0870 - acc: 0.554 - 4s 669us/step - loss: 7.0877 - acc: 0.5542 - val_loss: 7.7443 - val_acc: 0.4886 Epoch 00008: val_loss improved from 7.76598 to 7.74427, saving model to saved_models/weights.best.Xception4.hdf5 Epoch 9/20 6680/6680 [==============================] - ETA: 4s - loss: 5.4842 - acc: 0.660 - ETA: 4s - loss: 6.7721 - acc: 0.580 - ETA: 4s - loss: 7.1476 - acc: 0.556 - ETA: 4s - loss: 7.1740 - acc: 0.555 - ETA: 3s - loss: 7.2339 - acc: 0.550 - ETA: 3s - loss: 7.1836 - acc: 0.553 - ETA: 3s - loss: 7.0818 - acc: 0.558 - ETA: 3s - loss: 6.9507 - acc: 0.566 - ETA: 3s - loss: 6.8135 - acc: 0.573 - ETA: 3s - loss: 6.9386 - acc: 0.566 - ETA: 3s - loss: 6.9672 - acc: 0.564 - ETA: 3s - loss: 7.0366 - acc: 0.560 - ETA: 3s - loss: 7.1279 - acc: 0.554 - ETA: 3s - loss: 7.0798 - acc: 0.557 - ETA: 3s - loss: 7.1129 - acc: 0.556 - ETA: 3s - loss: 7.1671 - acc: 0.551 - ETA: 3s - loss: 7.1250 - acc: 0.554 - ETA: 3s - loss: 7.0481 - acc: 0.558 - ETA: 3s - loss: 7.0263 - acc: 0.560 - ETA: 3s - loss: 7.0803 - acc: 0.556 - ETA: 2s - loss: 7.0724 - acc: 0.556 - ETA: 2s - loss: 7.0514 - acc: 0.558 - ETA: 2s - loss: 7.1162 - acc: 0.554 - ETA: 2s - loss: 7.0889 - acc: 0.555 - ETA: 2s - loss: 7.0807 - acc: 0.555 - ETA: 2s - loss: 7.0689 - acc: 0.556 - ETA: 2s - loss: 7.0904 - acc: 0.555 - ETA: 2s - loss: 7.1193 - acc: 0.553 - ETA: 2s - loss: 7.1307 - acc: 0.552 - ETA: 2s - loss: 7.1671 - acc: 0.550 - ETA: 2s - loss: 7.1354 - acc: 0.552 - ETA: 2s - loss: 7.1748 - acc: 0.550 - ETA: 2s - loss: 7.1434 - acc: 0.551 - ETA: 2s - loss: 7.0992 - acc: 0.554 - ETA: 2s - loss: 7.1497 - acc: 0.551 - ETA: 1s - loss: 7.1308 - acc: 0.552 - ETA: 1s - loss: 7.1646 - acc: 0.550 - ETA: 1s - loss: 7.1586 - acc: 0.551 - ETA: 1s - loss: 7.1541 - acc: 0.551 - ETA: 1s - loss: 7.1768 - acc: 0.550 - ETA: 1s - loss: 7.1433 - acc: 0.552 - ETA: 1s - loss: 7.1421 - acc: 0.552 - ETA: 1s - loss: 7.1147 - acc: 0.554 - ETA: 1s - loss: 7.1436 - acc: 0.552 - ETA: 1s - loss: 7.1388 - acc: 0.553 - ETA: 1s - loss: 
7.1483 - acc: 0.552 - ETA: 1s - loss: 7.1437 - acc: 0.553 - ETA: 1s - loss: 7.1266 - acc: 0.554 - ETA: 1s - loss: 7.1259 - acc: 0.554 - ETA: 1s - loss: 7.1125 - acc: 0.554 - ETA: 1s - loss: 7.0963 - acc: 0.555 - ETA: 0s - loss: 7.0993 - acc: 0.555 - ETA: 0s - loss: 7.0933 - acc: 0.556 - ETA: 0s - loss: 7.0873 - acc: 0.556 - ETA: 0s - loss: 7.0889 - acc: 0.556 - ETA: 0s - loss: 7.0863 - acc: 0.556 - ETA: 0s - loss: 7.1033 - acc: 0.555 - ETA: 0s - loss: 7.0909 - acc: 0.555 - ETA: 0s - loss: 7.0858 - acc: 0.556 - ETA: 0s - loss: 7.0760 - acc: 0.556 - ETA: 0s - loss: 7.0736 - acc: 0.556 - ETA: 0s - loss: 7.0666 - acc: 0.557 - ETA: 0s - loss: 7.0850 - acc: 0.556 - ETA: 0s - loss: 7.0747 - acc: 0.556 - ETA: 0s - loss: 7.0516 - acc: 0.558 - ETA: 0s - loss: 7.0756 - acc: 0.556 - 5s 680us/step - loss: 7.0754 - acc: 0.5564 - val_loss: 7.6564 - val_acc: 0.5006 Epoch 00009: val_loss improved from 7.74427 to 7.65635, saving model to saved_models/weights.best.Xception4.hdf5 Epoch 10/20 6680/6680 [==============================] - ETA: 4s - loss: 6.2861 - acc: 0.610 - ETA: 4s - loss: 7.4274 - acc: 0.535 - ETA: 4s - loss: 7.1015 - acc: 0.556 - ETA: 4s - loss: 7.1797 - acc: 0.552 - ETA: 4s - loss: 7.0978 - acc: 0.558 - ETA: 3s - loss: 7.0431 - acc: 0.561 - ETA: 3s - loss: 7.1192 - acc: 0.557 - ETA: 3s - loss: 7.0559 - acc: 0.561 - ETA: 3s - loss: 7.0421 - acc: 0.562 - ETA: 3s - loss: 6.9020 - acc: 0.571 - ETA: 3s - loss: 6.9488 - acc: 0.568 - ETA: 3s - loss: 7.0548 - acc: 0.561 - ETA: 3s - loss: 7.2064 - acc: 0.552 - ETA: 3s - loss: 7.1870 - acc: 0.553 - ETA: 3s - loss: 7.1822 - acc: 0.553 - ETA: 3s - loss: 7.1161 - acc: 0.557 - ETA: 3s - loss: 7.1052 - acc: 0.558 - ETA: 3s - loss: 7.0507 - acc: 0.561 - ETA: 3s - loss: 7.0590 - acc: 0.561 - ETA: 3s - loss: 7.0534 - acc: 0.561 - ETA: 2s - loss: 7.0169 - acc: 0.563 - ETA: 2s - loss: 7.0643 - acc: 0.560 - ETA: 2s - loss: 7.0246 - acc: 0.562 - ETA: 2s - loss: 7.0148 - acc: 0.562 - ETA: 2s - loss: 7.0566 - acc: 0.560 - ETA: 2s - loss: 7.0827 - acc: 0.558 - ETA: 2s - loss: 7.0893 - acc: 0.558 - ETA: 2s - loss: 7.0721 - acc: 0.559 - ETA: 2s - loss: 7.0895 - acc: 0.558 - ETA: 2s - loss: 7.0590 - acc: 0.560 - ETA: 2s - loss: 7.0913 - acc: 0.558 - ETA: 2s - loss: 7.1666 - acc: 0.553 - ETA: 2s - loss: 7.1595 - acc: 0.554 - ETA: 2s - loss: 7.1527 - acc: 0.554 - ETA: 2s - loss: 7.1280 - acc: 0.556 - ETA: 1s - loss: 7.1225 - acc: 0.556 - ETA: 1s - loss: 7.0878 - acc: 0.558 - ETA: 1s - loss: 7.0667 - acc: 0.559 - ETA: 1s - loss: 7.0598 - acc: 0.559 - ETA: 1s - loss: 7.0687 - acc: 0.559 - ETA: 1s - loss: 7.0734 - acc: 0.558 - ETA: 1s - loss: 7.1048 - acc: 0.556 - ETA: 1s - loss: 7.0895 - acc: 0.557 - ETA: 1s - loss: 7.0946 - acc: 0.557 - ETA: 1s - loss: 7.1303 - acc: 0.555 - ETA: 1s - loss: 7.1015 - acc: 0.557 - ETA: 1s - loss: 7.1047 - acc: 0.556 - ETA: 1s - loss: 7.1078 - acc: 0.556 - ETA: 1s - loss: 7.1107 - acc: 0.556 - ETA: 1s - loss: 7.1056 - acc: 0.556 - ETA: 1s - loss: 7.1148 - acc: 0.556 - ETA: 0s - loss: 7.1426 - acc: 0.554 - ETA: 0s - loss: 7.1485 - acc: 0.554 - ETA: 0s - loss: 7.1386 - acc: 0.554 - ETA: 0s - loss: 7.1190 - acc: 0.555 - ETA: 0s - loss: 7.0898 - acc: 0.557 - ETA: 0s - loss: 7.0769 - acc: 0.558 - ETA: 0s - loss: 7.0913 - acc: 0.557 - ETA: 0s - loss: 7.0732 - acc: 0.558 - ETA: 0s - loss: 7.0950 - acc: 0.557 - ETA: 0s - loss: 7.0976 - acc: 0.556 - ETA: 0s - loss: 7.0745 - acc: 0.558 - ETA: 0s - loss: 7.0645 - acc: 0.558 - ETA: 0s - loss: 7.0624 - acc: 0.559 - ETA: 0s - loss: 7.0480 - acc: 0.560 - ETA: 0s - loss: 7.0560 - acc: 0.559 - 5s 683us/step 
- loss: 7.0584 - acc: 0.5594 - val_loss: 7.8249 - val_acc: 0.4790 Epoch 00010: val_loss did not improve Epoch 11/20 6680/6680 [==============================] - ETA: 4s - loss: 7.8074 - acc: 0.510 - ETA: 4s - loss: 7.1876 - acc: 0.545 - ETA: 4s - loss: 6.7292 - acc: 0.573 - ETA: 3s - loss: 6.8603 - acc: 0.567 - ETA: 3s - loss: 6.6947 - acc: 0.578 - ETA: 3s - loss: 6.7431 - acc: 0.573 - ETA: 3s - loss: 6.7295 - acc: 0.572 - ETA: 3s - loss: 6.7965 - acc: 0.568 - ETA: 3s - loss: 6.8114 - acc: 0.568 - ETA: 3s - loss: 6.8513 - acc: 0.567 - ETA: 3s - loss: 6.9193 - acc: 0.562 - ETA: 3s - loss: 6.9337 - acc: 0.562 - ETA: 3s - loss: 6.9087 - acc: 0.564 - ETA: 3s - loss: 6.8874 - acc: 0.566 - ETA: 3s - loss: 6.8674 - acc: 0.568 - ETA: 3s - loss: 6.8735 - acc: 0.566 - ETA: 3s - loss: 6.8989 - acc: 0.565 - ETA: 3s - loss: 6.8559 - acc: 0.568 - ETA: 3s - loss: 6.8514 - acc: 0.568 - ETA: 3s - loss: 6.8800 - acc: 0.567 - ETA: 2s - loss: 6.8286 - acc: 0.570 - ETA: 2s - loss: 6.7820 - acc: 0.573 - ETA: 2s - loss: 6.8025 - acc: 0.572 - ETA: 2s - loss: 6.8549 - acc: 0.569 - ETA: 2s - loss: 6.8923 - acc: 0.566 - ETA: 2s - loss: 6.8938 - acc: 0.566 - ETA: 2s - loss: 6.8534 - acc: 0.569 - ETA: 2s - loss: 6.8678 - acc: 0.568 - ETA: 2s - loss: 6.9088 - acc: 0.566 - ETA: 2s - loss: 6.9096 - acc: 0.566 - ETA: 2s - loss: 6.9162 - acc: 0.566 - ETA: 2s - loss: 6.9217 - acc: 0.565 - ETA: 2s - loss: 6.9318 - acc: 0.565 - ETA: 2s - loss: 6.9175 - acc: 0.566 - ETA: 2s - loss: 6.9229 - acc: 0.566 - ETA: 1s - loss: 6.9014 - acc: 0.567 - ETA: 1s - loss: 6.8983 - acc: 0.567 - ETA: 1s - loss: 6.9119 - acc: 0.566 - ETA: 1s - loss: 6.9248 - acc: 0.565 - ETA: 1s - loss: 6.9492 - acc: 0.564 - ETA: 1s - loss: 6.9440 - acc: 0.564 - ETA: 1s - loss: 6.9424 - acc: 0.564 - ETA: 1s - loss: 6.9687 - acc: 0.562 - ETA: 1s - loss: 6.9605 - acc: 0.563 - ETA: 1s - loss: 6.9535 - acc: 0.563 - ETA: 1s - loss: 6.9635 - acc: 0.563 - ETA: 1s - loss: 6.9594 - acc: 0.563 - ETA: 1s - loss: 6.9621 - acc: 0.563 - ETA: 1s - loss: 6.9822 - acc: 0.561 - ETA: 1s - loss: 6.9816 - acc: 0.561 - ETA: 1s - loss: 6.9927 - acc: 0.561 - ETA: 0s - loss: 7.0102 - acc: 0.560 - ETA: 0s - loss: 7.0188 - acc: 0.559 - ETA: 0s - loss: 7.0143 - acc: 0.559 - ETA: 0s - loss: 7.0366 - acc: 0.558 - ETA: 0s - loss: 7.0434 - acc: 0.558 - ETA: 0s - loss: 7.0499 - acc: 0.557 - ETA: 0s - loss: 7.0840 - acc: 0.555 - ETA: 0s - loss: 7.0789 - acc: 0.555 - ETA: 0s - loss: 7.0791 - acc: 0.556 - ETA: 0s - loss: 7.0952 - acc: 0.555 - ETA: 0s - loss: 7.0848 - acc: 0.555 - ETA: 0s - loss: 7.0926 - acc: 0.555 - ETA: 0s - loss: 7.0724 - acc: 0.556 - ETA: 0s - loss: 7.0702 - acc: 0.556 - ETA: 0s - loss: 7.0779 - acc: 0.556 - 5s 680us/step - loss: 7.0608 - acc: 0.5576 - val_loss: 7.6686 - val_acc: 0.4982 Epoch 00011: val_loss did not improve Epoch 12/20 6680/6680 [==============================] - ETA: 3s - loss: 7.7367 - acc: 0.520 - ETA: 3s - loss: 7.0516 - acc: 0.560 - ETA: 3s - loss: 7.1750 - acc: 0.553 - ETA: 3s - loss: 7.1268 - acc: 0.555 - ETA: 3s - loss: 7.3456 - acc: 0.542 - ETA: 3s - loss: 7.2888 - acc: 0.545 - ETA: 3s - loss: 7.1692 - acc: 0.552 - ETA: 3s - loss: 7.3207 - acc: 0.543 - ETA: 3s - loss: 7.4386 - acc: 0.536 - ETA: 3s - loss: 7.3609 - acc: 0.541 - ETA: 3s - loss: 7.2975 - acc: 0.543 - ETA: 3s - loss: 7.2950 - acc: 0.543 - ETA: 3s - loss: 7.3548 - acc: 0.539 - ETA: 3s - loss: 7.3366 - acc: 0.540 - ETA: 3s - loss: 7.2773 - acc: 0.544 - ETA: 3s - loss: 7.2681 - acc: 0.543 - ETA: 3s - loss: 7.2483 - acc: 0.545 - ETA: 3s - loss: 7.2396 - acc: 0.546 - ETA: 2s - loss: 7.2318 - 
Epoch 12/20
6680/6680 [==============================] - 4s 672us/step - loss: 7.0516 - acc: 0.5605 - val_loss: 7.8020 - val_acc: 0.4850
Epoch 00012: val_loss did not improve
Epoch 13/20
6680/6680 [==============================] - 4s 672us/step - loss: 7.0217 - acc: 0.5597 - val_loss: 7.6658 - val_acc: 0.4946
Epoch 00013: val_loss did not improve
Epoch 14/20
6680/6680 [==============================] - 5s 680us/step - loss: 6.9935 - acc: 0.5627 - val_loss: 7.6327 - val_acc: 0.5006
Epoch 00014: val_loss improved from 7.65635 to 7.63269, saving model to saved_models/weights.best.Xception4.hdf5
Epoch 15/20
6680/6680 [==============================] - 5s 677us/step - loss: 6.9761 - acc: 0.5653 - val_loss: 7.7325 - val_acc: 0.4898
Epoch 00015: val_loss did not improve
Epoch 16/20
6680/6680 [==============================] - 5s 677us/step - loss: 6.1556 - acc: 0.6006 - val_loss: 5.6203 - val_acc: 0.5940
Epoch 00016: val_loss improved from 7.63269 to 5.62033, saving model to saved_models/weights.best.Xception4.hdf5
Epoch 17/20
6680/6680 [==============================] - 5s 676us/step - loss: 4.3483 - acc: 0.7010 - val_loss: 4.7988 - val_acc: 0.6395
Epoch 00017: val_loss improved from 5.62033 to 4.79876, saving model to saved_models/weights.best.Xception4.hdf5
Epoch 18/20
6680/6680 [==============================] - 5s 678us/step - loss: 3.7365 - acc: 0.7458 - val_loss: 4.5459 - val_acc: 0.6671
Epoch 00018: val_loss improved from 4.79876 to 4.54594, saving model to saved_models/weights.best.Xception4.hdf5
Epoch 19/20
6680/6680 [==============================] - 5s 674us/step - loss: 3.4821 - acc: 0.7650 - val_loss: 4.6745 - val_acc: 0.6491
Epoch 00019: val_loss did not improve
Epoch 20/20
6680/6680 [==============================] - 5s 678us/step - loss: 3.3014 - acc: 0.7792 - val_loss: 4.2288 - val_acc: 0.6802
Epoch 00020: val_loss improved from 4.54594 to 4.22884, saving model to saved_models/weights.best.Xception4.hdf5
<keras.callbacks.History at 0x1c9353fefd0>
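The "val_loss improved ... saving model to ..." messages in the log above come from a ModelCheckpoint callback with save_best_only=True, which writes out the weights only when the validation loss beats its previous best. As a minimal sketch of the kind of training call that produces such a log (the train_Xception and valid_Xception bottleneck-feature arrays and the batch size are assumptions here, named by analogy with the test_Xception features used below):
from keras.callbacks import ModelCheckpoint

# save weights only when validation loss improves on the previous best
checkpointer = ModelCheckpoint(filepath='saved_models/weights.best.Xception4.hdf5',
                               verbose=1, save_best_only=True)

# assumed bottleneck-feature arrays, analogous to test_Xception below
Xception_model4.fit(train_Xception, train_targets,
                    validation_data=(valid_Xception, valid_targets),
                    epochs=20, batch_size=20,
                    callbacks=[checkpointer], verbose=1)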
### TODO: Load the model weights with the best validation loss.
VGG19_model.load_weights('saved_models/weights.best.VGG19.hdf5')
VGG19_model1.load_weights('saved_models/weights.best.VGG191.hdf5')
VGG19_model2.load_weights('saved_models/weights.best.VGG192.hdf5')
VGG19_model3.load_weights('saved_models/weights.best.VGG193.hdf5')
VGG19_model4.load_weights('saved_models/weights.best.VGG194.hdf5')
InceptionV3_model.load_weights('saved_models/weights.best.InceptionV3.hdf5')
InceptionV3_model1.load_weights('saved_models/weights.best.InceptionV31.hdf5')
InceptionV3_model2.load_weights('saved_models/weights.best.InceptionV32.hdf5')
InceptionV3_model3.load_weights('saved_models/weights.best.InceptionV33.hdf5')
InceptionV3_model4.load_weights('saved_models/weights.best.InceptionV34.hdf5')
Resnet50_model.load_weights('saved_models/weights.best.Resnet50.hdf5')
#Resnet50_model1.load_weights('saved_models/weights.best.Resnet501.hdf5')
Resnet50_model2.load_weights('saved_models/weights.best.Resnet502.hdf5')
Resnet50_model3.load_weights('saved_models/weights.best.Resnet503.hdf5')
#Resnet50_model4.load_weights('saved_models/weights.best.Resnet504.hdf5')
Xception_model.load_weights('saved_models/weights.best.Xception.hdf5')
Xception_model1.load_weights('saved_models/weights.best.Xception1.hdf5')
Xception_model2.load_weights('saved_models/weights.best.Xception2.hdf5')
Xception_model3.load_weights('saved_models/weights.best.Xception3.hdf5')
Xception_model4.load_weights('saved_models/weights.best.Xception4.hdf5')
Try out your model on the test dataset of dog images. Ensure that your test accuracy is greater than 60%.
### TODO: Calculate classification accuracy on the test dataset.
# evaluate each trained model on the matching set of test bottleneck features
models_to_test = [
    ('VGG19 model',         VGG19_model,        test_VGG19),
    ('VGG19 model 1',       VGG19_model1,       test_VGG19),
    ('VGG19 model 2',       VGG19_model2,       test_VGG19),
    ('VGG19 model 3',       VGG19_model3,       test_VGG19),
    ('VGG19 model 4',       VGG19_model4,       test_VGG19),
    ('InceptionV3 model',   InceptionV3_model,  test_InceptionV3),
    ('InceptionV3 model 1', InceptionV3_model1, test_InceptionV3),
    ('InceptionV3 model 2', InceptionV3_model2, test_InceptionV3),
    ('InceptionV3 model 3', InceptionV3_model3, test_InceptionV3),
    ('InceptionV3 model 4', InceptionV3_model4, test_InceptionV3),
    ('Resnet50 model',      Resnet50_model,     test_Resnet50),
    #('Resnet50 model 1',   Resnet50_model1,    test_Resnet50),
    ('Resnet50 model 2',    Resnet50_model2,    test_Resnet50),
    ('Resnet50 model 3',    Resnet50_model3,    test_Resnet50),
    #('Resnet50 model 4',   Resnet50_model4,    test_Resnet50),
    ('Xception model',      Xception_model,     test_Xception),
    ('Xception model 1',    Xception_model1,    test_Xception),
    ('Xception model 2',    Xception_model2,    test_Xception),
    ('Xception model 3',    Xception_model3,    test_Xception),
    ('Xception model 4',    Xception_model4,    test_Xception),
]
for name, model, test_features in models_to_test:
    print('for %s:' % name)
    # get index of predicted dog breed for each image in test set
    predictions = [np.argmax(model.predict(np.expand_dims(feature, axis=0)))
                   for feature in test_features]
    # report test accuracy
    test_accuracy = 100 * np.sum(np.array(predictions) == np.argmax(test_targets, axis=1)) / len(predictions)
    print('Test accuracy: %.4f%%' % test_accuracy)
for VGG19 model: Test accuracy: 7.5359%
for VGG19 model 1: Test accuracy: 14.2344%
for VGG19 model 2: Test accuracy: 51.6746%
for VGG19 model 3: Test accuracy: 0.7177%
for VGG19 model 4: Test accuracy: 7.7751%
for InceptionV3 model: Test accuracy: 42.3445%
for InceptionV3 model 1: Test accuracy: 81.5789%
for InceptionV3 model 2: Test accuracy: 80.9809%
for InceptionV3 model 3: Test accuracy: 48.0861%
for InceptionV3 model 4: Test accuracy: 37.7990%
for Resnet50 model: Test accuracy: 81.2201%
for Resnet50 model 2: Test accuracy: 82.1770%
for Resnet50 model 3: Test accuracy: 81.5789%
for Xception model: Test accuracy: 46.5311%
for Xception model 1: Test accuracy: 78.1100%
for Xception model 2: Test accuracy: 85.4067%
for Xception model 3: Test accuracy: 82.2967%
for Xception model 4: Test accuracy: 67.4641%
Write a function that takes an image path as input and returns the dog breed (Affenpinscher, Afghan_hound, etc) that is predicted by your model.
Similar to the analogous function in Step 5, your function should have three steps:
1. Extract the bottleneck features corresponding to the chosen CNN model.
2. Supply the bottleneck features as input to the model to return the predicted vector. Note that the argmax of this prediction vector gives the index of the predicted dog breed.
3. Use the dog_names array defined in Step 0 of this notebook to return the corresponding breed.
The functions to extract the bottleneck features can be found in extract_bottleneck_features.py, and they have been imported in an earlier code cell. To obtain the bottleneck features corresponding to your chosen CNN architecture, you need to use the function
extract_{network}
where {network}, in the above function name, should be one of VGG19, Resnet50, InceptionV3, or Xception.
### TODO: Write a function that takes a path to an image as input
### and returns the dog breed that is predicted by the model.
from extract_bottleneck_features import *

def dog_breed_predict(img_path):
    # extract bottleneck features
    bottleneck_feature = extract_Xception(path_to_tensor(img_path))
    # obtain predicted vector from Xception model 2, the best performer above (85.41% test accuracy)
    predicted_vector = Xception_model2.predict(bottleneck_feature)
    # return dog breed that is predicted by the model
    return dog_names[np.argmax(predicted_vector)]
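As a quick sanity check, the function should recover the breed encoded in the file name of a labeled sample image. This sketch uses a path taken from the batch test further below; the expected output is grounded in the results printed there:
# expected to print 'Brittany' for this file, per the batch test output below
print(dog_breed_predict('images/mine/Brittany_02625.jpg'))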
Write an algorithm that accepts a file path to an image and first determines whether the image contains a human, dog, or neither. Then,
- if a dog is detected in the image, return the predicted breed.
- if a human is detected in the image, return the resembling dog breed.
- if neither is detected in the image, provide output that indicates an error.
You are welcome to write your own functions for detecting humans and dogs in images, but feel free to use the face_detector and dog_detector functions developed above. You are required to use your CNN from Step 5 to predict dog breed.
Some sample output for our algorithm is provided below, but feel free to design your own user experience!

### TODO: Write your algorithm.
### Feel free to use as many code cells as needed.
def human_dog_img_predict(img_path):
    # run the dog and human face detectors developed earlier in the notebook
    dog_detect = dog_detector(img_path)
    face_detect = face_detector(img_path)
    if not face_detect and not dog_detect:
        return "error - no humans and no dogs in this picture"
    ret = ""
    if dog_detect:
        ret += "dog found and its breed is: " + dog_breed_predict(img_path)
    if face_detect:
        if len(ret) > 0:
            # a dog was also found, so join the two messages
            ret += " also a "
        ret += "human found and it looks like " + dog_breed_predict(img_path) + " dog"
    return ret
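A few single-image calls exercise all three branches of the algorithm. The paths below are files from the batch test that follows, and the expected outputs in the comments are grounded in the results printed there:
# one dog image, one human image, and one image with neither,
# taken from the files run in the batch test below
for p in ['images/mine/Labrador_retriever_06449.jpg',  # -> dog found and its breed is: Labrador_retriever
          'images/mine/sample_human_output.png',       # -> human found and it looks like Brussels_griffon dog
          'images/mine/sample_cnn.png']:               # -> error - no humans and no dogs in this picture
    print(human_dog_img_predict(p))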
In this section, you will take your new algorithm for a spin! What kind of dog does the algorithm think that you look like? If you have a dog, does it predict your dog's breed accurately? If you have a cat, does it mistakenly think that your cat is a dog?
Test your algorithm on at least six images from your computer. Feel free to use any images you like. Use at least two human and two dog images.
Question 6: Is the output better than you expected :) ? Or worse :( ? Provide at least three possible points of improvement for your algorithm.
Answer: The algorithm works well when the images are clear and the dog is the main subject of the picture. Sometimes the same dog gets different predicted breeds depending on the picture. Three points to improve:
## TODO: Execute your algorithm from Step 6 on
## at least 6 images on your computer.
## Feel free to use as many code cells as needed.
import os

directory = os.path.join("images", "mine")
for filename in os.listdir(directory):
    img_path = os.path.join(directory, filename)
    print(img_path)
    # load color (BGR) image
    img = cv2.imread(img_path)
    # convert BGR image to RGB for plotting
    cv_rgb = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
    # display the image
    plt.imshow(cv_rgb)
    plt.show()
    print()
    print(human_dog_img_predict(img_path))
images\mine\20141001_231450.jpg
error - no humans and no dogs in this picture
images\mine\20141018_090447.jpg
error - no humans and no dogs in this picture
images\mine\20141018_090528.jpg
error - no humans and no dogs in this picture
images\mine\20150113_102321.jpg
human found and it looks like Petit_basset_griffon_vendeen dog
images\mine\20150120_141346.jpg
error - no humans and no dogs in this picture
images\mine\20150512_064644.jpg
error - no humans and no dogs in this picture
images\mine\20150515_095216.jpg
human found and it looks like Labrador_retriever dog
images\mine\20150515_095237.jpg
dog found and its breed is: Manchester_terrier
images\mine\20150515_095351.jpg
dog found and its breed is: German_pinscher
images\mine\20150520_081207.jpg
dog found and its breed is: Belgian_sheepdog also a human found and it looks like Belgian_sheepdog dog
images\mine\20150919_181612.jpg
error - no humans and no dogs in this picture
images\mine\20151006_132621.jpg
error - no humans and no dogs in this picture
images\mine\20160303_234307.jpg
dog found and its breed is: Canaan_dog also a human found and it looks like Canaan_dog dog
images\mine\20160603_214739.jpg
human found and it looks like Petit_basset_griffon_vendeen dog
images\mine\American_water_spaniel_00648.jpg
dog found and its breed is: Curly-coated_retriever
images\mine\Brittany_02625.jpg
dog found and its breed is: Brittany
images\mine\Curly-coated_retriever_03896.jpg
dog found and its breed is: Curly-coated_retriever
images\mine\Labrador_retriever_06449.jpg
dog found and its breed is: Labrador_retriever
images\mine\Labrador_retriever_06455.jpg
dog found and its breed is: Labrador_retriever
images\mine\Labrador_retriever_06457.jpg
dog found and its breed is: Labrador_retriever
images\mine\sample_cnn.png
error - no humans and no dogs in this picture
images\mine\sample_dog_output.png
error - no humans and no dogs in this picture
images\mine\sample_human_output.png
human found and it looks like Brussels_griffon dog
images\mine\Welsh_springer_spaniel_08203.jpg
dog found and its breed is: Welsh_springer_spaniel